Oct 06 14:53:21 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 14:53:21 crc restorecon[4752]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:21 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 
14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 14:53:22 crc 
restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 
14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 
14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc 
restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 14:53:22 crc restorecon[4752]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 06 14:53:23 crc kubenswrapper[4763]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.297135 4763 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306273 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306320 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306333 4763 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306348 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306362 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306374 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306386 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306397 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306409 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306419 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306429 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306439 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306450 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306460 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306469 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306477 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306486 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306493 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306501 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306509 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306516 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306525 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306533 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306541 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306550 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306557 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306566 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306573 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306581 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306589 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306597 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306605 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306675 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306698 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306708 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306717 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306725 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306734 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306748 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306761 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306770 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306779 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306788 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306796 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306805 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306813 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306823 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306831 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306839 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306847 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306858 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306869 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
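
Each "unrecognized feature gate" warning above names an OpenShift-level gate (GatewayAPI, AdminNetworkPolicy, and so on) that the kubelet's own feature-gate registry does not know; only upstream Kubernetes gates are accepted, which is why CloudDualStackNodeIPs, KMSv1, DisableKubeletCloudCredentialProviders and ValidatingAdmissionPolicy instead draw "Setting GA/deprecated feature gate" notices. The warnings are noisy but non-fatal, and the same dump recurs several times below as the gate set is re-applied. A sketch for collapsing the noise to one line per gate, again assuming journal text on stdin; the per-gate count then shows how many parse passes the startup made:

import re
import sys
from collections import Counter

# Gate names in these logs are plain identifiers, so \w+ is sufficient.
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

counts = Counter(UNRECOGNIZED.findall(sys.stdin.read()))
for gate, n in counts.most_common():
    print(f"{n}x {gate}")
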
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306880 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306891 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306902 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306912 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306933 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306946 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306959 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306969 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306979 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306989 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.306999 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307008 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307022 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307030 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307038 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307046 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307054 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307064 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.307071 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307237 4763 flags.go:64] FLAG: --address="0.0.0.0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307256 4763 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307274 4763 flags.go:64] FLAG: --anonymous-auth="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307299 4763 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307312 4763 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307322 4763 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307335 4763 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307346 4763 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307356 4763 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307365 4763 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307375 4763 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307387 4763 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307398 4763 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307409 4763 flags.go:64] FLAG: --cgroup-root=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307420 4763 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307431 4763 flags.go:64] FLAG: --client-ca-file=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307442 4763 flags.go:64] FLAG: --cloud-config=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307453 4763 flags.go:64] FLAG: --cloud-provider=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307465 4763 flags.go:64] FLAG: --cluster-dns="[]"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307479 4763 flags.go:64] FLAG: --cluster-domain=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307490 4763 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307501 4763 flags.go:64] FLAG: --config-dir=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307512 4763 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307525 4763 flags.go:64] FLAG: --container-log-max-files="5"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307536 4763 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307545 4763 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307554 4763 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307563 4763 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307571 4763 flags.go:64] FLAG: --contention-profiling="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307578 4763 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307585 4763 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307592 4763 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307597 4763 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307605 4763 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307612 4763 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307660 4763 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307672 4763 flags.go:64] FLAG: --enable-load-reader="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307681 4763 flags.go:64] FLAG: --enable-server="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307687 4763 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307699 4763 flags.go:64] FLAG: --event-burst="100"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307707 4763 flags.go:64] FLAG: --event-qps="50"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307714 4763 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307721 4763 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307728 4763 flags.go:64] FLAG: --eviction-hard=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307738 4763 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307745 4763 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307752 4763 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307762 4763 flags.go:64] FLAG: --eviction-soft=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307769 4763 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307777 4763 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307784 4763 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307791 4763 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307798 4763 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307805 4763 flags.go:64] FLAG: --fail-swap-on="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307812 4763 flags.go:64] FLAG: --feature-gates=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307822 4763 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307829 4763 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307837 4763 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307845 4763 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307852 4763 flags.go:64] FLAG: --healthz-port="10248"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307859 4763 flags.go:64] FLAG: --help="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307865 4763 flags.go:64] FLAG: --hostname-override=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307914 4763 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307925 4763 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307933 4763 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307940 4763 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307947 4763 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307955 4763 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307961 4763 flags.go:64] FLAG: --image-service-endpoint=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307967 4763 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307973 4763 flags.go:64] FLAG: --kube-api-burst="100"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307979 4763 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307985 4763 flags.go:64] FLAG: --kube-api-qps="50"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307990 4763 flags.go:64] FLAG: --kube-reserved=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.307996 4763 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308002 4763 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308008 4763 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308013 4763 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308018 4763 flags.go:64] FLAG: --lock-file=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308024 4763 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308029 4763 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308035 4763 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308045 4763 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308052 4763 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308058 4763 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308064 4763 flags.go:64] FLAG: --logging-format="text"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308070 4763 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308076 4763 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308081 4763 flags.go:64] FLAG: --manifest-url=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308086 4763 flags.go:64] FLAG: --manifest-url-header=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308095 4763 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308101 4763 flags.go:64] FLAG: --max-open-files="1000000"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308108 4763 flags.go:64] FLAG: --max-pods="110"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308114 4763 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308119 4763 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308125 4763 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308131 4763 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308136 4763 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308142 4763 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308147 4763 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308164 4763 flags.go:64] FLAG: --node-status-max-images="50"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308170 4763 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308176 4763 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308182 4763 flags.go:64] FLAG: --pod-cidr=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308187 4763 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308195 4763 flags.go:64] FLAG: --pod-manifest-path=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308201 4763 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308207 4763 flags.go:64] FLAG: --pods-per-core="0"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308212 4763 flags.go:64] FLAG: --port="10250"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308218 4763 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308223 4763 flags.go:64] FLAG: --provider-id=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308229 4763 flags.go:64] FLAG: --qos-reserved=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308234 4763 flags.go:64] FLAG: --read-only-port="10255"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308240 4763 flags.go:64] FLAG: --register-node="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308245 4763 flags.go:64] FLAG: --register-schedulable="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308250 4763 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308261 4763 flags.go:64] FLAG: --registry-burst="10"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308266 4763 flags.go:64] FLAG: --registry-qps="5"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308272 4763 flags.go:64] FLAG: --reserved-cpus=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308278 4763 flags.go:64] FLAG: --reserved-memory=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308286 4763 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308292 4763 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308298 4763 flags.go:64] FLAG: --rotate-certificates="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308303 4763 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308309 4763 flags.go:64] FLAG: --runonce="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308314 4763 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308324 4763 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308331 4763 flags.go:64] FLAG: --seccomp-default="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308336 4763 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308341 4763 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308347 4763 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308353 4763 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308358 4763 flags.go:64] FLAG: --storage-driver-password="root"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308364 4763 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308369 4763 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308375 4763 flags.go:64] FLAG: --storage-driver-user="root"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308380 4763 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308386 4763 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308392 4763 flags.go:64] FLAG: --system-cgroups=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308398 4763 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308407 4763 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308412 4763 flags.go:64] FLAG: --tls-cert-file=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308418 4763 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308425 4763 flags.go:64] FLAG: --tls-min-version=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308430 4763 flags.go:64] FLAG: --tls-private-key-file=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308436 4763 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308441 4763 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308447 4763 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308452 4763 flags.go:64] FLAG: --v="2"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308460 4763 flags.go:64] FLAG: --version="false"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308468 4763 flags.go:64] FLAG: --vmodule=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308476 4763 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.308482 4763 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308687 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308696 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308703 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308709 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308715 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308723 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308728 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308733 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308738 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308743 4763 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308747 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308753 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308757 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308762 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308769 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308775 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
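
The flags.go:64 block above records the effective value of every registered command-line flag, one FLAG: --name="value" record each, as parsed before the --config file is merged in; note that --cgroup-driver still shows its cgroupfs default here even though the CRI runtime later dictates systemd (see the server.go:1437 record near the end of this excerpt). A small parser sketch for turning a captured dump into a dictionary; the stdin input and the spot-checked flag names are assumptions for illustration, and the regex relies on values containing no double quotes, which holds for this dump:

import re
import sys

# One pair per FLAG record: FLAG: --name="value"
FLAG = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

flags = dict(FLAG.findall(sys.stdin.read()))
for name in ("--config", "--container-runtime-endpoint", "--node-ip", "--cgroup-driver"):
    print(f'{name} = "{flags.get(name, "<absent>")}"')
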
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308781 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308787 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308792 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308797 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308803 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308809 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308814 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308819 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308824 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308829 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308834 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308839 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308843 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308850 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308856 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308862 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308867 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308872 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308876 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308881 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308886 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308893 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308899 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308904 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308908 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308913 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308918 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308923 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308928 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308933 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308937 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308942 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308947 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308952 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308957 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308962 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308966 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308971 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308976 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308982 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308986 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308991 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.308996 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309001 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309005 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309010 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309015 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309020 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309024 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309029 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309033 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309038 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309043 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309049 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.309054 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.309071 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.321495 4763 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.321534 4763 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321691 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321705 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321715 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321724 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321734 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321743 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321751 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321759 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321767 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321775 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321783 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321791 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321798 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321806 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321814 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321821 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321829 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321839 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321850 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321860 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321867 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321875 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321883 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321891 4763 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321899 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321909 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321918 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321926 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321934 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321943 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321951 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321960 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321968 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321975 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321986 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.321994 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322002 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322009 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322017 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322025 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322033 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322040 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322048 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322056 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322063 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322074 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322083 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322091 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322101 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322109 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322117 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322125 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322132 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322139 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322147 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322154 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322163 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322170 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322178 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322185 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322193 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322201 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322209 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322217 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322228 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322237 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322246 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322254 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322262 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322270 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322280 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.322293 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322520 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322531 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322540 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322548 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322558 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322567 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322575 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322583 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322591 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322599 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322607 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322642 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322653 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322663 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322672 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322680 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322688 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322697 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322705 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322713 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322721 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322729 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322736 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322746 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322755 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322763 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322772 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322779 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322787 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322794 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322802 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322810 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322817 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322827 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322839 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322849 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322858 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322867 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322876 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322884 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322892 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322901 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322908 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322916 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322924 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322932 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322939 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322947 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322955 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322962 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322970 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322978 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322986 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.322996 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323005 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323014 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323023 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323032 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323040 4763 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323048 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323056 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323064 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323072 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323080 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323088 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323096 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323104 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323112 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323120 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323128 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.323138 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.323150 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.324306 4763 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.332136 4763 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.332310 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
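
The feature gates: {map[...]} summary, printed at .309071 above and twice more after it, is Go's fmt rendering of the fully resolved gate map. A parsing sketch; the summary string is trimmed here to five of the fifteen gates in the logged map:

import re

summary = ("feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true "
           "NodeSwap:false ValidatingAdmissionPolicy:true "
           "VolumeAttributesClass:false]}")

# Go prints a map as map[key:value key:value ...]; extract and split the pairs.
pairs = re.search(r"map\[(.*)\]", summary).group(1)
gates = {name: value == "true"
         for name, value in (item.split(":", 1) for item in pairs.split())}
print(gates)  # {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False, ...}
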
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.334169 4763 server.go:997] "Starting client certificate rotation"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.334218 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.335318 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-03 02:34:53.585572388 +0000 UTC
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.335450 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1379h41m30.250127371s for next certificate rotation
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.364464 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.368807 4763 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.387530 4763 log.go:25] "Validated CRI v1 runtime API"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.428435 4763 log.go:25] "Validated CRI v1 image API"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.431010 4763 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.438227 4763 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-14-41-05-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.438274 4763 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.468513 4763 manager.go:217] Machine: {Timestamp:2025-10-06 14:53:23.464739942 +0000 UTC m=+0.620032544 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5648b82a-0ebd-488c-add6-0c62e287c376 BootID:8a45ad59-aebd-449e-8dda-9594cfe75912 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:45:1e:f4 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:45:1e:f4 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:03:f9:9f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:fe:95 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:50:97 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:17:11:96 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:fd:31:04 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:fe:d6:4f:bd:a5:b3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:63:31:51:0e:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.468986 4763 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.469279 4763 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.471828 4763 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.472162 4763 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.472217 4763 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.472573 4763 topology_manager.go:138] "Creating topology manager with none policy"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.472591 4763 container_manager_linux.go:303] "Creating device plugin manager"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.473568 4763 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.473702 4763 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.474041 4763 state_mem.go:36] "Initialized new in-memory state store"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.474209 4763 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.478389 4763 kubelet.go:418] "Attempting to sync node with API server"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.478433 4763 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.478499 4763 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.478521 4763 kubelet.go:324] "Adding apiserver pod source"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.478540 4763 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.482963 4763 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.483951 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.485991 4763 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.486939 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.487027 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.487102 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.487150 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487761 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487818 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487839 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487857 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487886 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487904 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487923 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487951 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487974 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.487991 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.488018 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.488035 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.490400 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.491288 4763 server.go:1280] "Started kubelet"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.492767 4763 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.492769 4763 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.493300 4763 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 06 14:53:23 crc systemd[1]: Started Kubernetes Kubelet.
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.493842 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.497159 4763 server.go:460] "Adding debug handlers to kubelet server"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.497666 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.497746 4763 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.497945 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:43:53.543238346 +0000 UTC
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.498110 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1571h50m30.045135979s for next certificate rotation
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.498249 4763 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.498292 4763 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.498470 4763 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.498602 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.499696 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="200ms"
Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.503454 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.503575 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.504002 4763 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.504022 4763 factory.go:55] Registering systemd factory
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.504038 4763 factory.go:221] Registration of the systemd container factory successfully
Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.504472 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bee95152ebae7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 14:53:23.491232487 +0000 UTC m=+0.646525039,LastTimestamp:2025-10-06 14:53:23.491232487 +0000 UTC m=+0.646525039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.506145 4763 factory.go:153] Registering CRI-O factory
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.506183 4763 factory.go:221] Registration of the crio container factory successfully
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.506216 4763 factory.go:103] Registering Raw factory
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.506232 4763 manager.go:1196] Started watching for new ooms in manager
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.510944 4763 manager.go:319] Starting recovery of all containers
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515068 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515141 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515165 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515185 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515203 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515221 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515239 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515256 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515279 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515296 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515314 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515332 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515350 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515373 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515399 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515417 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515434 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515452 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515468 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515487 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515506 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515523 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515540 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515557 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515576 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515594 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515677 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515722 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515742 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515760 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515781 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515799 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515817 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515838 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515855 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515873 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515892 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515909 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515928 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515948 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515968 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.515985 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516003 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516023 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516042 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516061 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516080 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516097 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516115 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516132 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516150 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516167 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516190 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516210 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516228 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516246 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516299 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516320 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516337 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516354 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516372 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516390 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516406 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516427 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516443 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516462 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516483 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516504 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516521 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516538 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516557 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516577 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516600 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516645 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516740 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516760 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516778 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516801 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516819 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516837 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516855 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516874 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516891 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516908 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516926 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516944 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516964 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.516983 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517004 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517023 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517041 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517060 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517080 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517107 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517125 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517153 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517172 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517190 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517210 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517229 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517247 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517264 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517282 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517301 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517327 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517347 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517367 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517389 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517409 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517427 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517445 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517465 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517484 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517503 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517574 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517595 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.517660 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520444 4763 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520483 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520503 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520516 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520529 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520541 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520553 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520565 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520577 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520590 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520602 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520629 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520642 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520654 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520666 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520677 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520689 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520700 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520710 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520723 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520733 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520745 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520755 4763
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520767 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520820 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520831 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520841 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520852 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520862 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520873 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520883 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520930 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520943 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520955 4763 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520968 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520981 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.520995 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521007 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521019 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521031 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521044 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521056 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521067 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521081 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521093 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521104 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521114 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521125 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521136 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521147 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521157 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521168 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521179 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521190 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521199 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521209 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521218 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521228 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521239 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521249 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521270 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521280 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521290 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521300 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521309 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521321 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521331 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521344 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521354 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521364 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521375 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521385 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521396 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521404 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521414 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521425 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521436 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521447 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521456 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521467 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521478 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521490 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521501 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521511 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521521 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521531 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521541 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521554 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521564 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521575 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521585 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521593 4763 reconstruct.go:97] "Volume reconstruction finished" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.521600 4763 reconciler.go:26] "Reconciler: start to sync state" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.542728 4763 manager.go:324] Recovery completed Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.555112 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.557215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.557603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.557619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.561552 4763 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.561575 4763 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.561659 4763 state_mem.go:36] "Initialized new in-memory state store" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.571647 4763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.573468 4763 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.573508 4763 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.573743 4763 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.574029 4763 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 14:53:23 crc kubenswrapper[4763]: W1006 14:53:23.575123 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.575191 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.580659 4763 policy_none.go:49] "None policy: Start" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.581339 4763 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.581368 4763 state_mem.go:35] "Initializing new in-memory state store" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.599184 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.638919 4763 manager.go:334] "Starting Device Plugin manager" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.638980 4763 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.638997 4763 server.go:79] "Starting device plugin registration server" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.639483 4763 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.639524 4763 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.639603 4763 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.639717 4763 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.639727 4763 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.645936 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.674131 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 14:53:23 crc kubenswrapper[4763]: 
I1006 14:53:23.674292 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.675465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.675503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.675515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.675831 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.676082 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.676126 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.676927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.676960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.676975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677225 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677363 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.677392 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678252 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678408 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678440 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.678518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679555 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.679976 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680009 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680847 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.680873 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.681769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.681811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.681830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.703725 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="400ms" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.725765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.725870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.725920 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.725964 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726009 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726089 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") 
pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726213 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726476 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.726531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.740303 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.741592 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.741690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.741712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.741756 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.742365 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827460 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827670 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827704 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827735 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827736 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827850 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827838 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828059 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828100 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.827858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.828191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.942926 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.944796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.944831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.944842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.944866 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:23 crc kubenswrapper[4763]: E1006 14:53:23.945289 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 06 14:53:23 crc kubenswrapper[4763]: I1006 14:53:23.999481 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.009807 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.029141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.050182 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.057293 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.060492 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c871687ec7dc6e64c9c6203b772b70c48111be3b74ec3676dd6a91903b06c2bc WatchSource:0}: Error finding container c871687ec7dc6e64c9c6203b772b70c48111be3b74ec3676dd6a91903b06c2bc: Status 404 returned error can't find the container with id c871687ec7dc6e64c9c6203b772b70c48111be3b74ec3676dd6a91903b06c2bc Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.064072 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4af20bd22def5d692d70ab83230681b59a3af6075d1066a7980cdb2a9d1fdb8e WatchSource:0}: Error finding container 4af20bd22def5d692d70ab83230681b59a3af6075d1066a7980cdb2a9d1fdb8e: Status 404 returned error can't find the container with id 4af20bd22def5d692d70ab83230681b59a3af6075d1066a7980cdb2a9d1fdb8e Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.078825 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8d6dd7aab701436ac34fc69ce4dc99865f8b2c6e95e819f58a21764aab5615dd WatchSource:0}: Error finding container 8d6dd7aab701436ac34fc69ce4dc99865f8b2c6e95e819f58a21764aab5615dd: Status 404 returned error can't find the container with id 8d6dd7aab701436ac34fc69ce4dc99865f8b2c6e95e819f58a21764aab5615dd Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.088486 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7144271a1672fd1921a36477a19b4f92598b492df5702dca40a01c21efb40c8d WatchSource:0}: Error finding container 7144271a1672fd1921a36477a19b4f92598b492df5702dca40a01c21efb40c8d: Status 404 returned error can't find the container with id 7144271a1672fd1921a36477a19b4f92598b492df5702dca40a01c21efb40c8d Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.089844 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7eae7b5e83bee936b1ea1c9fb77a947121d7c8a0034b62fb24de60105d4f2b91 WatchSource:0}: Error finding container 7eae7b5e83bee936b1ea1c9fb77a947121d7c8a0034b62fb24de60105d4f2b91: Status 404 returned error can't find the container with id 7eae7b5e83bee936b1ea1c9fb77a947121d7c8a0034b62fb24de60105d4f2b91 Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.105217 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="800ms" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.345788 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.347693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.347736 4763 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.347746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.347773 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.348280 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.374227 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bee95152ebae7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 14:53:23.491232487 +0000 UTC m=+0.646525039,LastTimestamp:2025-10-06 14:53:23.491232487 +0000 UTC m=+0.646525039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.427298 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.427405 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.440907 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.440997 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.495450 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.578431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c871687ec7dc6e64c9c6203b772b70c48111be3b74ec3676dd6a91903b06c2bc"} Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.579968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7eae7b5e83bee936b1ea1c9fb77a947121d7c8a0034b62fb24de60105d4f2b91"} Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.581025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7144271a1672fd1921a36477a19b4f92598b492df5702dca40a01c21efb40c8d"} Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.582186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d6dd7aab701436ac34fc69ce4dc99865f8b2c6e95e819f58a21764aab5615dd"} Oct 06 14:53:24 crc kubenswrapper[4763]: I1006 14:53:24.583359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4af20bd22def5d692d70ab83230681b59a3af6075d1066a7980cdb2a9d1fdb8e"} Oct 06 14:53:24 crc kubenswrapper[4763]: W1006 14:53:24.779282 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.779387 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:24 crc kubenswrapper[4763]: E1006 14:53:24.906849 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="1.6s" Oct 06 14:53:25 crc kubenswrapper[4763]: W1006 14:53:25.146783 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:25 crc kubenswrapper[4763]: E1006 14:53:25.146892 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.149157 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.150517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc 
kubenswrapper[4763]: I1006 14:53:25.150554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.150577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.150604 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:25 crc kubenswrapper[4763]: E1006 14:53:25.150972 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.495377 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.588797 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7" exitCode=0 Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.588979 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.589016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.590351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.590399 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.590421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.591876 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="160fa431da4b4a7661b2c6514ac1220b1757c56b8ee4a154b2d311829075154a" exitCode=0 Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.591988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"160fa431da4b4a7661b2c6514ac1220b1757c56b8ee4a154b2d311829075154a"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.592172 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.593637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.593671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.593686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 
14:53:25.594076 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595855 4763 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd" exitCode=0 Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595953 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.595994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.597515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.597553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.597572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.606339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.606395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.606411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.606423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.607034 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.611190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.611233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.611247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.612826 4763 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1" exitCode=0 Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.612886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1"} Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.612913 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.614250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.614287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:25 crc kubenswrapper[4763]: I1006 14:53:25.614299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.072646 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.081246 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.210346 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:26 crc kubenswrapper[4763]: W1006 14:53:26.339682 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:26 crc kubenswrapper[4763]: E1006 14:53:26.339820 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.495185 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:26 crc kubenswrapper[4763]: E1006 14:53:26.508089 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.233:6443: connect: connection refused" interval="3.2s" Oct 06 14:53:26 crc kubenswrapper[4763]: W1006 14:53:26.538135 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 06 14:53:26 crc kubenswrapper[4763]: E1006 14:53:26.538255 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.622770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.622844 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.622857 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.622852 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.624420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.624492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.624507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.628464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.628516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.628534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.628552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.631056 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7d7209d06da60a2b863feab92d001c5868e3699cdb2a231e05f0f3ea69bb3d90" exitCode=0 Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.631137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7d7209d06da60a2b863feab92d001c5868e3699cdb2a231e05f0f3ea69bb3d90"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.631295 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.632678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.632730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.632747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.634425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9f1f4c0f9470910dd15e4b6e688f798068848b21eef74b55a3c77eeaa651a9bf"} Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.634491 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.634539 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.638263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.638350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.638367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.639405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.639436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.639461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.751288 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.752487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.752537 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.752552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:26 crc kubenswrapper[4763]: I1006 14:53:26.752585 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:26 crc kubenswrapper[4763]: E1006 14:53:26.753159 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.640362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8"} Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.640468 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.641911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.641961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.641979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.645590 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="35fa29beb8e043e7c1313f377f231cfeb3620907db0dcbd1800e9fbedd9bc472" exitCode=0 Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.645827 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.646882 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.647560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"35fa29beb8e043e7c1313f377f231cfeb3620907db0dcbd1800e9fbedd9bc472"} Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.647659 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.647736 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.648337 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.648392 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.649690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.649745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:27 crc 
kubenswrapper[4763]: I1006 14:53:27.649768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.650812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.650866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.650888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.651948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.651995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.652016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.652851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.652899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.653189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:27 crc kubenswrapper[4763]: I1006 14:53:27.739087 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.213837 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652288 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"60b1af2409c21814eea65f9ba77b0bdb847f5a486730b256f4402ebd0a2712f4"} Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be1d43c78441b65d1af4af0db01b8876d6566b13dc920063824d2cbde502b517"} Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652356 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5250c78d81b7d43ebb86616ab5782a534cf41cbfd9cbd66a841e391713ef511"} Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652371 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25540c5b5badb96e8043e2c3d33056966b0249542aced4bb813661d0241e0975"} Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652379 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.652412 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.653639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.653669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.653677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:28 crc kubenswrapper[4763]: I1006 14:53:28.654905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.211504 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.211661 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.662287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"03662ece0be8570fbf574ed468090386380ba97ad909fab312b21b505d4dafc1"} Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.662339 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.662403 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 
14:53:29.662407 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.663697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.663726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.663735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.664345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.664378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.664388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.953925 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.955576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.955669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.955681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:29 crc kubenswrapper[4763]: I1006 14:53:29.955720 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.665983 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.667965 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.668011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.668028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.852715 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.852897 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.852953 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.854793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.854890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:30 crc kubenswrapper[4763]: I1006 14:53:30.854910 
4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.319327 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.669676 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.671435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.671513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.671540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.969137 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.969389 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.970816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.970862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:31 crc kubenswrapper[4763]: I1006 14:53:31.970877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:33 crc kubenswrapper[4763]: E1006 14:53:33.646061 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.182207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.182426 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.183867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.183919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.183941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.187453 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.679207 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.680677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.680742 4763 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:35 crc kubenswrapper[4763]: I1006 14:53:35.680759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:37 crc kubenswrapper[4763]: W1006 14:53:37.230208 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 14:53:37 crc kubenswrapper[4763]: I1006 14:53:37.230277 4763 trace.go:236] Trace[1841587501]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 14:53:27.228) (total time: 10001ms): Oct 06 14:53:37 crc kubenswrapper[4763]: Trace[1841587501]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:53:37.230) Oct 06 14:53:37 crc kubenswrapper[4763]: Trace[1841587501]: [10.001525728s] [10.001525728s] END Oct 06 14:53:37 crc kubenswrapper[4763]: E1006 14:53:37.230293 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 14:53:37 crc kubenswrapper[4763]: I1006 14:53:37.496695 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 06 14:53:38 crc kubenswrapper[4763]: W1006 14:53:38.024078 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.024239 4763 trace.go:236] Trace[524397145]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 14:53:28.022) (total time: 10001ms): Oct 06 14:53:38 crc kubenswrapper[4763]: Trace[524397145]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:53:38.024) Oct 06 14:53:38 crc kubenswrapper[4763]: Trace[524397145]: [10.001997941s] [10.001997941s] END Oct 06 14:53:38 crc kubenswrapper[4763]: E1006 14:53:38.024275 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.660756 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.660967 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.661987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.662035 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.662053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.698264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.698475 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.699585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.699645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.699661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.726083 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.880190 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.880291 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.891286 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 06 14:53:38 crc kubenswrapper[4763]: I1006 14:53:38.891456 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.211218 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.211327 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.689606 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.690761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.690794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:39 crc kubenswrapper[4763]: I1006 14:53:39.690806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.860041 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.860280 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.861960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.862002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.862018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:40 crc kubenswrapper[4763]: I1006 14:53:40.866015 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 14:53:41 crc kubenswrapper[4763]: I1006 14:53:41.107485 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 06 14:53:41 crc kubenswrapper[4763]: I1006 14:53:41.694741 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 14:53:41 crc kubenswrapper[4763]: I1006 14:53:41.700036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:41 crc kubenswrapper[4763]: I1006 14:53:41.700115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:41 crc kubenswrapper[4763]: I1006 14:53:41.700151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:42 crc kubenswrapper[4763]: I1006 14:53:42.878120 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.646259 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.882374 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.907450 4763 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.907596 4763 trace.go:236] Trace[361851039]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 14:53:32.257) (total time: 11650ms):
Oct 06 14:53:43 crc kubenswrapper[4763]: Trace[361851039]: ---"Objects listed" error:<nil> 11650ms (14:53:43.907)
Oct 06 14:53:43 crc kubenswrapper[4763]: Trace[361851039]: [11.650421208s] [11.650421208s] END
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.907670 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.908031 4763 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.908065 4763 trace.go:236] Trace[357838990]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 14:53:31.713) (total time: 12194ms):
Oct 06 14:53:43 crc kubenswrapper[4763]: Trace[357838990]: ---"Objects listed" error:<nil> 12194ms (14:53:43.907)
Oct 06 14:53:43 crc kubenswrapper[4763]: Trace[357838990]: [12.194421269s] [12.194421269s] END
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.908095 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.908300 4763 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.910058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.910107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.910124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.910148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.910198 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:43Z","lastTransitionTime":"2025-10-06T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.930748 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.936146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.936212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.936231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.936256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.936274 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:43Z","lastTransitionTime":"2025-10-06T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.942783 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45454->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.942849 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45454->192.168.126.11:17697: read: connection reset by peer"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.942854 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45470->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.942938 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45470->192.168.126.11:17697: read: connection reset by peer"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.943256 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.943295 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.943716 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.943790 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.952468 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.957049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.957123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.957141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.957166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.957183 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:43Z","lastTransitionTime":"2025-10-06T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.974371 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.979636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.979678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.979687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.979704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.979715 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:43Z","lastTransitionTime":"2025-10-06T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:43 crc kubenswrapper[4763]: E1006 14:53:43.991389 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.996303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.996340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.996347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.996364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:43 crc kubenswrapper[4763]: I1006 14:53:43.996375 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:43Z","lastTransitionTime":"2025-10-06T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.010226 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.010511 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.012566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.012642 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.012669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.012695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.012707 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.114743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.114786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.114796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.114814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.114825 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.217852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.218300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.218533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.218850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.219064 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.322797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.322897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.322916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.322944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.322965 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.426647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.426716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.426726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.426744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.426757 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.489361 4763 apiserver.go:52] "Watching apiserver" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.493753 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.493989 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.494490 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.494589 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.494697 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.494704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.494767 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.494865 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.494883 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.495119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.495201 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.497998 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.498272 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.498431 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.499210 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.499833 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.499920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.500205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.501864 4763 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.502225 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.502631 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.511994 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512093 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512126 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512191 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512291 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512386 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512478 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512637 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512693 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512788 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512832 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512907 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512940 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.512972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513088 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513417 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513520 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513565 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513587 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513607 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513631 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513814 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513860 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513881 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514771 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514792 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514814 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514856 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514876 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514900 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514950 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515040 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515066 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515088 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515109 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515200 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513471 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513508 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513546 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513819 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513878 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.513963 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514262 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514791 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.514931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515122 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.515232 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 14:53:45.015207212 +0000 UTC m=+22.170499844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517041 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517065 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517087 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515905 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517219 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517233 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517241 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517381 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517416 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517442 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517470 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517494 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517520 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517556 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517605 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517631 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517678 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517703 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517725 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517795 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517824 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517900 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.517995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518457 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518503 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518529 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518616 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518639 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518683 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518752 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518776 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518894 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518967 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.518991 4763 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519013 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519060 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519084 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519107 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519131 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519211 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc 
kubenswrapper[4763]: I1006 14:53:44.519236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519310 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519387 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519437 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519468 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: 
I1006 14:53:44.519499 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519542 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519568 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519627 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519703 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519777 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519802 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519901 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519927 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520067 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520093 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520119 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520143 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520167 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520192 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520241 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520264 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520315 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520340 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520366 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520391 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520416 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520446 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520472 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520526 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520572 
4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520596 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520645 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520686 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520709 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520755 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.520779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
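Every reconciler_common.go:159 entry in the run above has the same shape, so the interesting fields (the volume name, the plugin-scoped UniqueName, and the pod UID) can be pulled out mechanically. A small Go sketch of that extraction; the regular expression targets the journal-escaped quoting (\") exactly as it appears in these lines, and the field names are mine:

```go
package main

import (
	"fmt"
	"regexp"
)

// The reconciler lines above all follow one shape:
//   operationExecutor.UnmountVolume started for volume \"NAME\" (UniqueName: \"PLUGIN/UID-NAME\") pod \"UID\" ...
// This pattern matches the backslash-escaped quotes verbatim as journald shows them.
var unmountRe = regexp.MustCompile(
	`UnmountVolume started for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"\) pod \\"([^"\\]+)\\"`)

func main() {
	// One raw journal line copied from the capture above.
	line := `Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "`
	if m := unmountRe.FindStringSubmatch(line); m != nil {
		fmt.Printf("volume=%s\nunique=%s\npod=%s\n", m[1], m[2], m[3])
	}
}
```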
event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536572 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551087 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551087 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551244 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551278 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519254 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
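The status patch above never reaches the API server: admission must call the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743, and that connection is refused because the component is not up yet (the same bootstrapping gap as the CNI message earlier). A quick, purely illustrative Go probe that distinguishes "nothing listening" from a TLS- or HTTP-level failure:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// Illustrative probe, not an OpenShift tool: "dial tcp 127.0.0.1:9743:
// connect: connection refused" in the log means no process was accepting
// connections on the webhook port, as opposed to a TLS or HTTP error
// that would occur after a successful dial.
func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		fmt.Println("webhook endpoint not reachable:", err)
		return
	}
	conn.Close()
	fmt.Println("something is listening on 127.0.0.1:9743")
}
```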
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515573 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516014 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516309 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551657 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551758 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551787 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 
14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.516930 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.519534 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.515323 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.521912 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.522082 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.552836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.522993 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.523271 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524252 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524722 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525218 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525427 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.524310 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551950 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553164 4763 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553281 4763 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553309 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553330 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553353 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553371 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553389 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.525988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.526276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.526439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.526530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.526655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.526714 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.527255 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.527511 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.527620 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.528588 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529310 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529457 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.529991 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.530078 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.530584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.530853 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.531286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.531744 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.531788 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.534568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.535136 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.536116 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.537064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.537091 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.537406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.537527 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.537931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538491 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538585 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538595 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.538841 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539019 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539100 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539371 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539454 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539506 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539610 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.539831 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.540065 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.540325 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.540749 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.541938 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542109 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542127 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542250 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542474 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542941 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.542954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.543279 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.547143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.547333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.547444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.547876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.548015 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.548050 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.548287 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.548310 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.548697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549003 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549027 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549327 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.549800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550309 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550668 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550765 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.550804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551121 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551580 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.551594 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.551851 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.551946 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.554785 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.553406 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.555751 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:45.054075509 +0000 UTC m=+22.209368131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.555818 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:45.055804999 +0000 UTC m=+22.211097611 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557732 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557770 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557787 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557823 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557836 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557849 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557861 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557897 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557910 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557921 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.557935 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558798 4763 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc 
kubenswrapper[4763]: I1006 14:53:44.558822 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558835 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558847 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558860 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558872 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558884 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558897 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558909 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558921 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558933 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558946 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558960 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558972 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 
14:53:44.558983 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558994 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.559008 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.559026 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.559037 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.558934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.559051 4763 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.561969 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.562894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.565714 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566030 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566135 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566327 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566421 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566498 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566572 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566649 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566799 4763 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.566914 4763 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567002 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567078 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567153 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567230 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567313 4763 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567424 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.567966 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.568341 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.569103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.573786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.573985 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.574197 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.575980 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.576086 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.576213 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.576242 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.576259 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.576284 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.576355 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:45.076320419 +0000 UTC m=+22.231612931 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.576375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.577167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.577323 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.577387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.579958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.580046 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.581207 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.581344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.583142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.583222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.583398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.583492 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.583606 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.585183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.585690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.586247 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.587540 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.587571 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.587585 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:44 crc kubenswrapper[4763]: E1006 14:53:44.587646 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:45.08762521 +0000 UTC m=+22.242917822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.590222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.590595 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.591758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.591839 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.592426 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.592925 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.593939 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.596889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.597697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599342 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599370 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599491 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599591 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.599966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.600648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.601486 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.601920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.602453 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.603998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.604135 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.604902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.604983 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.605034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.605397 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.605568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.605822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.605851 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.606067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.606231 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.606980 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.607037 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.619175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.629165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.634985 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.644017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.644090 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.644103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.644121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.644133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.649485 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.671939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672281 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672447 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672459 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672468 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672476 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672484 4763 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672492 4763 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672501 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672508 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672517 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672526 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672535 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672544 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672553 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672560 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672568 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672576 4763 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672584 4763 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672592 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672600 4763 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672609 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672620 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672629 4763 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672638 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672646 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672669 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672678 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672685 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672693 4763 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672701 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672709 4763 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672716 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672725 4763 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672732 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672740 4763 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672748 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672755 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672763 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672772 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672780 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672789 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672798 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672806 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672815 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672823 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672831 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672839 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672846 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672854 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672862 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672871 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672879 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672888 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672896 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672904 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672913 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672921 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672929 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672937 4763 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672945 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672953 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672960 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672968 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672976 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672985 4763 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.672993 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673001 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673009 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673016 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673025 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673034 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: 
I1006 14:53:44.673041 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673050 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673057 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673065 4763 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673073 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673080 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673088 4763 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673095 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673103 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673110 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673118 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673126 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673134 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 
14:53:44.673142 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673150 4763 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673158 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673165 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673173 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673182 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673190 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673198 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673206 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673213 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673221 4763 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673229 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673237 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 
06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673245 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673252 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673260 4763 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673267 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673275 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673282 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673290 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673299 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673307 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673315 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673323 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673330 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673338 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 
14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673346 4763 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673355 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673362 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673370 4763 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673378 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673387 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673395 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673402 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673410 4763 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673418 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673425 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673433 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673441 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc 
kubenswrapper[4763]: I1006 14:53:44.673448 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673456 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673465 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673473 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673480 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673497 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673514 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673521 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673529 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673537 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673544 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673552 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673561 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 
14:53:44.673569 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673577 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673584 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673593 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673602 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673610 4763 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.673900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.686897 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.696977 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.706432 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.707650 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8" exitCode=255 Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.707756 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.722976 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.735171 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.745282 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.746566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.746614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.746635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.746652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.746662 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.756247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.773181 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.783371 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.794486 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.803251 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.811218 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.811546 4763 scope.go:117] "RemoveContainer" containerID="4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.824688 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.837192 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.851231 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.853396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.853429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.853440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.853455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.853466 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:44 crc kubenswrapper[4763]: W1006 14:53:44.857036 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a70a532066d4038c2db57d8dfaf671d9eb925d93387527940a8e1595f924ac27 WatchSource:0}: Error finding container a70a532066d4038c2db57d8dfaf671d9eb925d93387527940a8e1595f924ac27: Status 404 returned error can't find the container with id a70a532066d4038c2db57d8dfaf671d9eb925d93387527940a8e1595f924ac27 Oct 06 14:53:44 crc kubenswrapper[4763]: W1006 14:53:44.864600 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-92b875a81002fcc9c98956f05fc3fa9d320b0837b49f068a545bdeda18377c03 WatchSource:0}: Error finding container 92b875a81002fcc9c98956f05fc3fa9d320b0837b49f068a545bdeda18377c03: Status 404 returned error can't find the container with id 92b875a81002fcc9c98956f05fc3fa9d320b0837b49f068a545bdeda18377c03 Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.957460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.957491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.957500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.957516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:44 crc kubenswrapper[4763]: I1006 14:53:44.957524 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:44Z","lastTransitionTime":"2025-10-06T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.059367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.059410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.059421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.059439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.059452 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.077475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.077535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.077561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.077590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077700 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077712 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:53:46.077687041 +0000 UTC m=+23.232979553 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077752 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:46.077739293 +0000 UTC m=+23.233031805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077750 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077782 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077776 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077795 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077863 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:46.077845446 +0000 UTC m=+23.233137958 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.077879 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:46.077873747 +0000 UTC m=+23.233166259 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.162143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.162175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.162205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.162221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.162231 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.178379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.178566 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.178602 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.178636 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.178688 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:46.178671364 +0000 UTC m=+23.333963886 (durationBeforeRetry 1s). 
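The nestedpendingoperations entries above reject each failed volume setup and schedule a retry after the durationBeforeRetry shown in the message. A minimal sketch of that retry shape (illustrative only, not kubelet source; setupVolume is a hypothetical stand-in for MountVolume.SetUp, and the real kubelet grows this backoff exponentially rather than keeping the 1s delay fixed):

package main

import (
	"errors"
	"fmt"
	"time"
)

// setupVolume stands in for MountVolume.SetUp; it fails while the
// referenced ConfigMap/Secret object is still "not registered".
func setupVolume() error {
	return errors.New(`object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered`)
}

func main() {
	durationBeforeRetry := 1 * time.Second // first retry interval seen in the log
	for attempt := 1; attempt <= 3; attempt++ {
		if err := setupVolume(); err != nil {
			fmt.Printf("attempt %d: %v; no retries permitted until %s\n",
				attempt, err, time.Now().Add(durationBeforeRetry).Format(time.RFC3339))
			time.Sleep(durationBeforeRetry)
			continue
		}
		fmt.Println("volume mounted")
		return
	}
}

These "not registered" failures typically clear on their own once the kubelet's object cache has synced the referenced ConfigMaps and Secrets from the API server; no manual remount is needed.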
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.264086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.264118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.264144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.264160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.264168 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.365896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.365935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.365944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.365959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.365970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.468108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.468389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.468485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.468577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.468689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.571184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.571542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.571628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.571734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.571828 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.574210 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:45 crc kubenswrapper[4763]: E1006 14:53:45.574433 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
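Every NodeNotReady burst above carries the same root cause: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration yet. A minimal diagnostic sketch (an illustrative check, not part of the kubelet or CRI-O) that looks for the config files the runtime expects in that directory:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory quoted in the log message above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := false
	for _, e := range entries {
		// CNI config files end in .conf, .conflist, or .json
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; NetworkReady stays false")
	}
}

Once the network plugin (here, OVN-Kubernetes brought up by the network operator pods starting later in the log) writes its .conflist into that directory, the Ready condition should flip on the next runtime status sync.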
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.578308 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.579195 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.580146 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.580962 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.581778 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.582497 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.583341 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.584122 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.585024 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.585733 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.586446 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.590144 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.590890 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.593148 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.593638 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.594480 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.595055 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.595445 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.596401 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.596961 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.597394 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.600083 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.600493 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.601471 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.601945 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.602928 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.603553 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.604394 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.605049 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.608318 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.608858 4763 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.608956 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.610532 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.611459 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.611856 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.613437 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.614390 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.614905 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.615870 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.616617 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.617618 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.618213 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.619203 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.619864 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.620731 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.621244 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.622065 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.622760 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.623574 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.624129 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.625005 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.625533 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.626066 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.626958 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.673627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.674061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.674127 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.674198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.674258 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.710569 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.710609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.710624 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92b875a81002fcc9c98956f05fc3fa9d320b0837b49f068a545bdeda18377c03"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.711878 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a70a532066d4038c2db57d8dfaf671d9eb925d93387527940a8e1595f924ac27"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.712884 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.712910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b3b1ce059c2b61a5209b79434dff247a813f6f8dcfaa3fb04b12785c326d45f8"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.714300 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.715817 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.716298 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.731740 4763 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.745765 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.759789 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.775522 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.776357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.776401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.776414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.776433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.776445 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
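The webhook failures above change character at 14:53:45: the earlier status patches failed with "connection refused" while nothing listened on 127.0.0.1:9743, and now the webhook answers but serves a certificate whose NotAfter (2025-08-24T17:21:41Z) predates the current time (2025-10-06). A minimal sketch for confirming that from the node (diagnostics only, not part of any OpenShift tooling):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the webhook URL in the log entries above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // diagnostics only: accept the cert even if expired
	})
	if err != nil {
		// While the webhook pod is down this reproduces "connect: connection refused".
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("NotBefore=%s NotAfter=%s expired=%v\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		time.Now().After(cert.NotAfter))
}

On a CRC instance resumed after a long pause this window is expected: the embedded certificates have aged out, and the cluster operators rotate them during startup, after which these patch attempts can begin to succeed.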
Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.786862 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.800595 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.810072 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gqz6f"] Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.810415 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bj6z5"] Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.810601 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.810676 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.818233 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.818381 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.818441 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.818629 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.818675 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.819388 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.820706 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.822869 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.825460 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.837118 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.850209 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.858468 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.869436 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.878720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.878744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.878752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.878767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.878776 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.881802 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.893503 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.903709 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.916821 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.927523 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.981095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.981134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.981145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.981162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.981175 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:45Z","lastTransitionTime":"2025-10-06T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-bin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-kubelet\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984538 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-daemon-config\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984555 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-k8s-cni-cncf-io\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-netns\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984595 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-multus-certs\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984614 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-etc-kubernetes\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984646 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ac212b7-747d-41ba-9f91-e79c223fb17f-hosts-file\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:45 crc 
kubenswrapper[4763]: I1006 14:53:45.984710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-system-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984756 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-os-release\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984774 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-conf-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgrq\" (UniqueName: \"kubernetes.io/projected/7ac212b7-747d-41ba-9f91-e79c223fb17f-kube-api-access-csgrq\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984815 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cnibin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-multus\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cni-binary-copy\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984948 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-socket-dir-parent\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-hostroot\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.984994 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58zr\" (UniqueName: \"kubernetes.io/projected/22f7ff70-c0ad-406d-aa9d-6824cb935c66-kube-api-access-b58zr\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:45 crc kubenswrapper[4763]: I1006 14:53:45.985015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.083467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.083502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.083511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.083526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.083537 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-system-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-os-release\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-conf-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-etc-kubernetes\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.085981 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ac212b7-747d-41ba-9f91-e79c223fb17f-hosts-file\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086003 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgrq\" (UniqueName: \"kubernetes.io/projected/7ac212b7-747d-41ba-9f91-e79c223fb17f-kube-api-access-csgrq\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086045 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cnibin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cni-binary-copy\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-multus\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-socket-dir-parent\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-hostroot\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086194 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b58zr\" (UniqueName: \"kubernetes.io/projected/22f7ff70-c0ad-406d-aa9d-6824cb935c66-kube-api-access-b58zr\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-bin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-kubelet\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-daemon-config\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-k8s-cni-cncf-io\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086327 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-netns\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-multus-certs\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-multus-certs\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.086482 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:53:48.086466302 +0000 UTC m=+25.241758814 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086605 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-system-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-os-release\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-conf-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086744 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-etc-kubernetes\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.086777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ac212b7-747d-41ba-9f91-e79c223fb17f-hosts-file\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.086831 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.086865 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:48.086856473 +0000 UTC m=+25.242148985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.087264 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cnibin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087329 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087364 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:48.087352628 +0000 UTC m=+25.242645140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087424 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087442 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087453 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.087480 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:48.087472001 +0000 UTC m=+25.242764513 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-cni-binary-copy\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-multus\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088333 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-socket-dir-parent\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-hostroot\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088623 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-cni-dir\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088684 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-cni-bin\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088723 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-var-lib-kubelet\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-k8s-cni-cncf-io\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.088911 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/22f7ff70-c0ad-406d-aa9d-6824cb935c66-host-run-netns\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.089195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22f7ff70-c0ad-406d-aa9d-6824cb935c66-multus-daemon-config\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.108981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b58zr\" (UniqueName: \"kubernetes.io/projected/22f7ff70-c0ad-406d-aa9d-6824cb935c66-kube-api-access-b58zr\") pod \"multus-bj6z5\" (UID: \"22f7ff70-c0ad-406d-aa9d-6824cb935c66\") " pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.119466 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgrq\" (UniqueName: \"kubernetes.io/projected/7ac212b7-747d-41ba-9f91-e79c223fb17f-kube-api-access-csgrq\") pod \"node-resolver-gqz6f\" (UID: \"7ac212b7-747d-41ba-9f91-e79c223fb17f\") " pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.122584 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gqz6f" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.128688 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bj6z5" Oct 06 14:53:46 crc kubenswrapper[4763]: W1006 14:53:46.153942 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ac212b7_747d_41ba_9f91_e79c223fb17f.slice/crio-a394e0106c3d840786bdc22c8e7cf337abd69fc57b9f8be4957bde50ac891700 WatchSource:0}: Error finding container a394e0106c3d840786bdc22c8e7cf337abd69fc57b9f8be4957bde50ac891700: Status 404 returned error can't find the container with id a394e0106c3d840786bdc22c8e7cf337abd69fc57b9f8be4957bde50ac891700 Oct 06 14:53:46 crc kubenswrapper[4763]: W1006 14:53:46.160809 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f7ff70_c0ad_406d_aa9d_6824cb935c66.slice/crio-872c961a27493316b06c2b063de0846d6456adb14641c258f316b7eb3c69c68d WatchSource:0}: Error finding container 872c961a27493316b06c2b063de0846d6456adb14641c258f316b7eb3c69c68d: Status 404 returned error can't find the container with id 872c961a27493316b06c2b063de0846d6456adb14641c258f316b7eb3c69c68d Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.203665 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.203897 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.203912 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.203934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.203954 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:48.203938557 +0000 UTC m=+25.359231069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.204454 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tzrc6"] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.205379 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208878 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.208976 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-9g2sw"] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.209011 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.209095 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.209296 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.213394 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.213588 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.213918 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.214014 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.214268 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.214285 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jnftg"] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.215007 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.218946 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220452 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220672 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220720 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220731 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220730 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.220675 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.222568 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.222930 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.233139 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.237140 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.254541 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.267026 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.284571 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.299010 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304591 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-proxy-tls\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-os-release\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304655 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdw7z\" (UniqueName: \"kubernetes.io/projected/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-kube-api-access-hdw7z\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304716 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-cnibin\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqpk\" (UniqueName: \"kubernetes.io/projected/9f86111c-14e3-4725-b3cf-b62a3b711813-kube-api-access-6hqpk\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-rootfs\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.304809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-system-cni-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.311170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc 
kubenswrapper[4763]: I1006 14:53:46.311204 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.311215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.311232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.311244 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.317966 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.337919 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.350788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.367063 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.381395 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.397369 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405465 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405480 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405537 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzf8\" (UniqueName: \"kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405608 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405833 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-proxy-tls\") 
pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405850 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-os-release\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdw7z\" (UniqueName: \"kubernetes.io/projected/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-kube-api-access-hdw7z\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405925 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-cnibin\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqpk\" (UniqueName: \"kubernetes.io/projected/9f86111c-14e3-4725-b3cf-b62a3b711813-kube-api-access-6hqpk\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.405972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-os-release\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406054 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406112 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-cnibin\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406244 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406229 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406294 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-system-cni-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406336 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f86111c-14e3-4725-b3cf-b62a3b711813-system-cni-dir\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-rootfs\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-rootfs\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f86111c-14e3-4725-b3cf-b62a3b711813-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.406525 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.408894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-proxy-tls\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.409327 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.414464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.414502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.414511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.414528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.414539 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.426042 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.426275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdw7z\" (UniqueName: 
\"kubernetes.io/projected/4c91c0c6-f031-4840-bc66-ab38e8fb67c7-kube-api-access-hdw7z\") pod \"machine-config-daemon-9g2sw\" (UID: \"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\") " pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.429801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqpk\" (UniqueName: \"kubernetes.io/projected/9f86111c-14e3-4725-b3cf-b62a3b711813-kube-api-access-6hqpk\") pod \"multus-additional-cni-plugins-tzrc6\" (UID: \"9f86111c-14e3-4725-b3cf-b62a3b711813\") " pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.442169 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.460434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.488085 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.501660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507534 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507608 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507680 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507775 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzf8\" (UniqueName: \"kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507804 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507842 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.507991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508733 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: 
\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508823 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508906 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.508969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc 
kubenswrapper[4763]: I1006 14:53:46.509347 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.509386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.509409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.509434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.509874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.513248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.516221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.516255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.516265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.516282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.516293 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.521773 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.528246 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.528930 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzf8\" (UniqueName: \"kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8\") pod \"ovnkube-node-jnftg\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.540076 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.552093 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.561188 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.561446 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:46 crc kubenswrapper[4763]: W1006 14:53:46.571447 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c91c0c6_f031_4840_bc66_ab38e8fb67c7.slice/crio-5ac563419d4075da982991fbda272a737b8fd2630c94822e280424b0544c3069 WatchSource:0}: Error finding container 5ac563419d4075da982991fbda272a737b8fd2630c94822e280424b0544c3069: Status 404 returned error can't find the container with id 5ac563419d4075da982991fbda272a737b8fd2630c94822e280424b0544c3069 Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.573947 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.574123 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.574186 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:46 crc kubenswrapper[4763]: E1006 14:53:46.574321 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:46 crc kubenswrapper[4763]: W1006 14:53:46.598054 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe8cb5e2_cf64_4ca2_a8b2_6044db6de7c7.slice/crio-bf5ae8835061bdd818e8ef4eef71805be6b7f26f6d1e80aa3541a244e616ce0c WatchSource:0}: Error finding container bf5ae8835061bdd818e8ef4eef71805be6b7f26f6d1e80aa3541a244e616ce0c: Status 404 returned error can't find the container with id bf5ae8835061bdd818e8ef4eef71805be6b7f26f6d1e80aa3541a244e616ce0c Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.608692 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.624963 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.630957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.630992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.631000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.631019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.631028 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.659993 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.719787 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"5ac563419d4075da982991fbda272a737b8fd2630c94822e280424b0544c3069"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 
14:53:46.724924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gqz6f" event={"ID":"7ac212b7-747d-41ba-9f91-e79c223fb17f","Type":"ContainerStarted","Data":"dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.725063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gqz6f" event={"ID":"7ac212b7-747d-41ba-9f91-e79c223fb17f","Type":"ContainerStarted","Data":"a394e0106c3d840786bdc22c8e7cf337abd69fc57b9f8be4957bde50ac891700"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.727230 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"bf5ae8835061bdd818e8ef4eef71805be6b7f26f6d1e80aa3541a244e616ce0c"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.728164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerStarted","Data":"296c586fe7df91eb269ef818492fbdb51334ae3e97a28009598dc1ee8a7fcc1a"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.729899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerStarted","Data":"6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.729924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerStarted","Data":"872c961a27493316b06c2b063de0846d6456adb14641c258f316b7eb3c69c68d"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.733821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.733865 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.733879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.733917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.733931 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.742978 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.756047 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.772092 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.793200 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.812401 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.826261 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.837291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.837343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.837357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.837377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.837390 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.838030 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.853289 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa9308
9f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.867986 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.884989 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.898608 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.913435 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.935340 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.939109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.939170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.939183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.939201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.939216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:46Z","lastTransitionTime":"2025-10-06T14:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.948054 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.958936 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.968469 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.980527 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:46 crc kubenswrapper[4763]: I1006 14:53:46.996213 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.011609 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.022256 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.036753 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.041239 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.041271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.041282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.041316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.041328 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.049603 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.061070 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.074458 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.093205 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.106956 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.144403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.144455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.144466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.144485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.144497 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.247146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.247294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.247316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.247380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.247402 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.349942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.349986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.349996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.350016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.350027 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.452563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.452640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.452652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.452677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.452688 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.554737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.554781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.554792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.554811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.554822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.575129 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:47 crc kubenswrapper[4763]: E1006 14:53:47.575244 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.657610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.657677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.657688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.657705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.657714 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.736296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.736345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.737697 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" exitCode=0 Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.737778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.739093 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9" exitCode=0 Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.739167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.743743 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.749532 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.760348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.760666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.760681 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.760702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.760714 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.764122 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.778520 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.791579 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.804152 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.814539 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.830715 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.843606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.858438 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.864142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.864175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.864183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.864199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.864210 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.872876 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.895219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.909184 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.923192 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.936091 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.951399 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.966389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:47 crc 
kubenswrapper[4763]: I1006 14:53:47.966420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.966428 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.966442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.966451 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:47Z","lastTransitionTime":"2025-10-06T14:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.966560 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:47 crc kubenswrapper[4763]: I1006 14:53:47.988438 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.001376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:47Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.011596 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.025351 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.040562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.052463 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.068439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.068468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.068479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.068496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.068509 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.077690 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.093176 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.113751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.125666 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.137005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.137134 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137154 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:53:52.13712945 +0000 UTC m=+29.292421952 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.137182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.137213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137299 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137326 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137344 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:52.137334086 +0000 UTC m=+29.292626598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137377 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:52.137361927 +0000 UTC m=+29.292654449 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137443 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137482 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137501 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.137587 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:52.137557242 +0000 UTC m=+29.292849964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.171768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.172146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.172157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.172174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.172184 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.237720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.237920 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.237945 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.237959 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.238024 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:53:52.23800801 +0000 UTC m=+29.393300532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.275334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.275362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.275371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.275386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.275396 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.378831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.378892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.378904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.378927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.378943 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.481256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.481292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.481303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.481320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.481330 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.574796 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.574803 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.574921 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:48 crc kubenswrapper[4763]: E1006 14:53:48.575031 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.583067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.583099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.583110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.583123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.583133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.685571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.685642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.685654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.685678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.685691 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.748058 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025" exitCode=0 Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.748132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753671 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.753789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.762855 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.780310 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.789338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.789410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.789426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.789454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.789476 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.795382 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.808930 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.827177 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.841082 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.859499 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.875720 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.888586 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.891992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.892045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.892053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.892068 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.892077 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.909532 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.924517 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.942475 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.942475 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z"
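The lastState above records exitCode 137 with reason ContainerStatusUnknown: the kubelet could not read the container's real final state after the pod was deleted, so the code is a synthesized placeholder rather than an observed kill. A short Go sketch (not kubelet code) of the usual 128+N shell convention for reading the exit codes that occur in this journal:

    // exitcodes.go - interpret the container exit codes seen above:
    // 0 for the completed init containers, 137 attached to the
    // synthesized "ContainerStatusUnknown" states, and the 255 that
    // kube-apiserver-check-endpoints exited with earlier.
    package main

    import "fmt"

    func describe(code int) string {
        switch {
        case code == 0:
            return "exited normally"
        case code == 255:
            // Consistent with the fatal "pods not found" message in the
            // check-endpoints log tail: an application-level exit(255).
            return "application exit(255)"
        case code > 128:
            // 128+N convention: 137 = 128+9, i.e. SIGKILL.
            return fmt.Sprintf("terminated by signal %d (128+%d)", code-128, code-128)
        default:
            return fmt.Sprintf("exited with application error %d", code)
        }
    }

    func main() {
        for _, c := range []int{0, 137, 255} { // values seen in this journal
            fmt.Printf("exit code %3d: %s\n", c, describe(c))
        }
    }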
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.956666 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:48Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.995511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.995571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.995588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.995621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:48 crc kubenswrapper[4763]: I1006 14:53:48.995675 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:48Z","lastTransitionTime":"2025-10-06T14:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.098687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.098723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.098731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.098745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.098755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.201139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.201195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.201213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.201238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.201311 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.305091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.305124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.305133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.305148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.305158 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.408224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.408289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.408308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.408337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.408356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.512426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.512483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.512496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.512516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.512532 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.574835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:53:49 crc kubenswrapper[4763]: E1006 14:53:49.575033 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
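The "Error syncing pod" entry above and the repeating NodeNotReady condition share one root cause: nothing has written a CNI network configuration yet, because the ovnkube-node pod that would produce it is itself still in PodInitializing (see its status earlier in this journal). A minimal Go sketch of the directory check behind the "no CNI configuration file" message (this is an illustration, not CRI-O's actual implementation):

    // cnicheck.go - look for CNI network config files in the directory
    // named by the kubelet message; while it is empty the runtime
    // reports NetworkReady=false and the node stays NotReady.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the journal message
        var found []string
        for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
            // Glob only errors on a malformed pattern; these are constant.
            matches, _ := filepath.Glob(filepath.Join(dir, pat))
            found = append(found, matches...)
        }
        if len(found) == 0 {
            fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
            os.Exit(1)
        }
        for _, f := range found {
            fmt.Println("CNI config:", f)
        }
    }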
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.616574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.616736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.616749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.616770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.616783 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.720255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.720338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.720351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.720374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.720389 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.759698 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf" exitCode=0
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.759795 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf"}
Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.777986 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.794375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.812773 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.825012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.825069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.825081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.825104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.825118 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.837600 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.856989 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.876209 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.888844 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.904551 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.915675 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.928599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.928680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.928693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.928715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.928728 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:49Z","lastTransitionTime":"2025-10-06T14:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.929245 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.945924 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.958283 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.975218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.984099 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-j9x9s"] Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.984508 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.987429 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.987445 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.987703 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 14:53:49 crc kubenswrapper[4763]: I1006 14:53:49.987864 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.000660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:49Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.017759 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.036518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.036581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.036599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.036628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.036665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.039796 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.053132 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.053400 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a002372a-0206-4c41-9e46-0491543d1d8e-serviceca\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.053450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a002372a-0206-4c41-9e46-0491543d1d8e-host\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.053512 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqgr6\" (UniqueName: \"kubernetes.io/projected/a002372a-0206-4c41-9e46-0491543d1d8e-kube-api-access-cqgr6\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.063556 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.076375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.092686 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.109320 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.130408 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.140376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.140433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.140443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.140464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.140474 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.143194 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.154516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a002372a-0206-4c41-9e46-0491543d1d8e-host\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.154603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgr6\" (UniqueName: \"kubernetes.io/projected/a002372a-0206-4c41-9e46-0491543d1d8e-kube-api-access-cqgr6\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.154607 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a002372a-0206-4c41-9e46-0491543d1d8e-host\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.154691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a002372a-0206-4c41-9e46-0491543d1d8e-serviceca\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.156182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a002372a-0206-4c41-9e46-0491543d1d8e-serviceca\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.159479 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.176606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.178756 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqgr6\" (UniqueName: \"kubernetes.io/projected/a002372a-0206-4c41-9e46-0491543d1d8e-kube-api-access-cqgr6\") pod \"node-ca-j9x9s\" (UID: \"a002372a-0206-4c41-9e46-0491543d1d8e\") " pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.194247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.215898 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.242976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.243039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.243053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.243076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.243090 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.299842 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j9x9s" Oct 06 14:53:50 crc kubenswrapper[4763]: W1006 14:53:50.310664 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda002372a_0206_4c41_9e46_0491543d1d8e.slice/crio-6850938fb9027b71c001115c3cdb0112b5f820fdba2aa43256d2883f78c7de1d WatchSource:0}: Error finding container 6850938fb9027b71c001115c3cdb0112b5f820fdba2aa43256d2883f78c7de1d: Status 404 returned error can't find the container with id 6850938fb9027b71c001115c3cdb0112b5f820fdba2aa43256d2883f78c7de1d Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.345249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.345289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.345299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.345315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.345327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.448757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.448798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.448807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.448825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.448836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.551784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.551837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.551850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.551869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.551882 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.574816 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.574876 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:50 crc kubenswrapper[4763]: E1006 14:53:50.574948 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:50 crc kubenswrapper[4763]: E1006 14:53:50.575078 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.655589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.655659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.655671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.655689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.655705 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.759739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.760177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.760200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.760227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.760244 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.766386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j9x9s" event={"ID":"a002372a-0206-4c41-9e46-0491543d1d8e","Type":"ContainerStarted","Data":"e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.766483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j9x9s" event={"ID":"a002372a-0206-4c41-9e46-0491543d1d8e","Type":"ContainerStarted","Data":"6850938fb9027b71c001115c3cdb0112b5f820fdba2aa43256d2883f78c7de1d"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.770676 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6" exitCode=0 Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.770756 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.787912 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is 
after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.803520 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.824458 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.845660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.858714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.863206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.863279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.863290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.863327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.863340 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.872705 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.892444 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.905475 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.919894 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.933405 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.949554 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.962035 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.965552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.965602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.965618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.965671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.965686 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:50Z","lastTransitionTime":"2025-10-06T14:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.975582 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:50 crc kubenswrapper[4763]: I1006 14:53:50.987989 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:50Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.001913 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.013275 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.030239 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.044150 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.056941 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.068715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.068768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.068778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.068805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.068817 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.072907 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.101312 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.113094 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.126498 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.138717 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.152644 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.164775 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.171190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.171224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.171234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.171253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.171265 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.178347 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.190682 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.273717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.273744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.273752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.273768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.273778 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.376472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.376506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.376515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.376531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.376542 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.478511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.478548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.478557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.478574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.478583 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.574410 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:51 crc kubenswrapper[4763]: E1006 14:53:51.574597 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.580705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.580735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.580745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.580759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.580770 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.682722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.682779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.682795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.682820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.682841 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.779184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.782653 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61" exitCode=0 Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.782711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.785250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.785314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.785412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.785535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.785562 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.802220 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.826252 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.842790 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.860734 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.877107 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.888052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.888254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.888317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.888430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.888511 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.890480 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.899796 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.916444 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.933166 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.947883 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.961507 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.976303 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.990822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.990896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.990907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.990925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.990937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:51Z","lastTransitionTime":"2025-10-06T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:51 crc kubenswrapper[4763]: I1006 14:53:51.994788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268
bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:51Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.009812 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.094008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.094062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.094079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.094104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.094122 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.173006 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.173128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.173188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173275 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 14:54:00.17324063 +0000 UTC m=+37.328533172 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173302 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173276 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173321 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173334 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173373 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:00.173357683 +0000 UTC m=+37.328650235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173401 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:00.173387364 +0000 UTC m=+37.328679906 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173413 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.173372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.173518 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:00.173498908 +0000 UTC m=+37.328791430 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.196360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.196400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.196412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.196431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.196443 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.274216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.274445 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.274479 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.274493 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.274558 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:00.274541893 +0000 UTC m=+37.429834405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.300040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.300096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.300108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.300125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.300137 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.403868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.403909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.403923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.403946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.403958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.507354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.507401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.507413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.507431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.507445 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.574169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.574214 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.574319 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:52 crc kubenswrapper[4763]: E1006 14:53:52.574530 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.611112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.611181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.611200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.611229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.611250 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.714257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.714297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.714308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.714326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.714337 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.788815 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f86111c-14e3-4725-b3cf-b62a3b711813" containerID="6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb" exitCode=0 Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.788861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerDied","Data":"6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.807355 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.816029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.816108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.816128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.816154 4763 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.816171 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.825025 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.842174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.855623 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.874810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.892079 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.909132 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.918510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.918547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.918564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.918584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.918598 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:52Z","lastTransitionTime":"2025-10-06T14:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.919841 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.929182 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.944062 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.956571 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:52 crc kubenswrapper[4763]: I1006 14:53:52.989938 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:52Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.026943 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.029793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.029826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.029834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.029849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.029859 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.049562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268
bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.132533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.132594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.132604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.132643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.132658 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.234961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.235016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.235032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.235056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.235069 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.338194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.338255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.338269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.338292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.338304 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.441511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.441566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.441581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.441602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.441636 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.544766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.544818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.544830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.544851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.544861 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.574700 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:53 crc kubenswrapper[4763]: E1006 14:53:53.574923 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.594312 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.607513 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.626341 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.642138 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.647590 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.647645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.647656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.647675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.647691 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.665776 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.683168 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.696191 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.714525 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820
a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.729123 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.744764 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.750342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.750408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.750421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.750445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.750464 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.758879 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.776514 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.799405 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.799914 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.803944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" event={"ID":"9f86111c-14e3-4725-b3cf-b62a3b711813","Type":"ContainerStarted","Data":"c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.807811 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z 
is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.822965 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.834051 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.840348 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.853432 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.853506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.853525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.853552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.853570 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.862755 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.878874 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.894241 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.914083 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.930594 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.954453 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.956498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.956688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:53 crc 
kubenswrapper[4763]: I1006 14:53:53.956846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.956897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.956919 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:53Z","lastTransitionTime":"2025-10-06T14:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.977744 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:53 crc kubenswrapper[4763]: I1006 14:53:53.997101 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:53Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.020932 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\
\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\
\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.036813 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.049266 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.049326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.049345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.049370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.049387 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.056562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.071301 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.077093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.077152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.077172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.077197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.077214 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.078734 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.096145 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.100326 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.100954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.101011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.101028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.101052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.101070 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.119222 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.125013 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5
648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.130079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.130140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.130159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.130185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.130203 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.139317 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.147687 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.152354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.152415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.152432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.152458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.152475 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.156106 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.166469 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.166698 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.168610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.168656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.168669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.168688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.168698 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.169466 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea1
77225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.183726 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.197837 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.211888 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.235484 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.257989 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0
df1eb5b04aefcdc8fd1e333b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.268019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.270676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.270717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.270730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.270748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.270760 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.281642 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.295316 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.309845 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.322530 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.372989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.373032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.373047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.373065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.373076 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.476062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.476470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.476491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.476516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.476533 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.574801 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.574879 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.574967 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:54 crc kubenswrapper[4763]: E1006 14:53:54.575052 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.579222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.579252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.579260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.579271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.579280 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.681677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.681720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.681730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.681745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.681757 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.784294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.784354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.784372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.784396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.784413 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.808485 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.809683 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.841306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.867293 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.887080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.887134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.887152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.887178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.887196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.889080 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.903871 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.919330 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.936468 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.953395 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.968029 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990815 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:54Z","lastTransitionTime":"2025-10-06T14:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:54 crc kubenswrapper[4763]: I1006 14:53:54.990888 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6
036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:54Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.011535 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.027340 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.050792 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.077333 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0
df1eb5b04aefcdc8fd1e333b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093171 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.093462 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.109939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:55Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.197246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.197310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.197328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.197356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.197378 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.305443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.305515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.305546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.305577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.305600 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.407782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.407829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.407840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.407859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.407869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.510419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.510661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.510728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.510793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.510864 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.574697 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:55 crc kubenswrapper[4763]: E1006 14:53:55.575051 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.614048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.614092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.614101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.614117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.614130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.717369 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.717471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.717494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.717522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.717541 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.812101 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.820445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.820475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.820486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.820501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.820513 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.923237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.923436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.923466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.923555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:55 crc kubenswrapper[4763]: I1006 14:53:55.923647 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:55Z","lastTransitionTime":"2025-10-06T14:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.026285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.026342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.026359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.026385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.026405 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.129181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.129254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.129280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.129308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.129329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.232710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.232755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.232767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.232786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.232799 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.335967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.336047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.336072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.336101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.336118 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.438925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.438988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.439007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.439040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.439064 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.542104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.542144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.542158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.542176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.542189 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.574889 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.574969 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:53:56 crc kubenswrapper[4763]: E1006 14:53:56.575115 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:53:56 crc kubenswrapper[4763]: E1006 14:53:56.575302 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.645367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.645435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.645451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.645475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.645494 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.748692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.748755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.748773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.748799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.748819 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.818548 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/0.log"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.822549 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b" exitCode=1
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.822663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.823888 4763 scope.go:117] "RemoveContainer" containerID="09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.844798 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.852015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.852074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.852092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.852117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.852134 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.867576 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.891135 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.908663 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.933223 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.956662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.956704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.956721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.956741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.956752 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:56Z","lastTransitionTime":"2025-10-06T14:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.958164 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:56 crc kubenswrapper[4763]: I1006 14:53:56.976258 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.001494 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:56Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.020236 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.041920 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.056919 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.060195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.060261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.060280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.060306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.060323 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.070836 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z"
Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.090106 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:55Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933050 6105 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933281 6105 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933495 6105 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933779 6105 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.933927 6105 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934026 6105 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934078 6105 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934586 6105 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.101242 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.163491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.163532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.163544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.163561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.163574 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.265949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.266002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.266014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.266027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.266036 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.369258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.369313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.369329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.369357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.369377 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.471678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.471721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.471733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.471753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.471765 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.573918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.573962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.573978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.573997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.574010 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.574043 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:57 crc kubenswrapper[4763]: E1006 14:53:57.574147 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.677039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.677105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.677121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.677148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.677163 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.780283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.780348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.780365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.780391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.780408 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.829675 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/0.log" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.834051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.834152 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.848813 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.863831 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.883020 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.884481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.884528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc 
kubenswrapper[4763]: I1006 14:53:57.884546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.884568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.884580 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.906399 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.929831 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.961674 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f09
5fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:55Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933050 6105 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933281 6105 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933495 6105 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933779 6105 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.933927 6105 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934026 6105 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934078 6105 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934586 6105 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.977104 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.987267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.987334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.987357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.987389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.987416 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:57Z","lastTransitionTime":"2025-10-06T14:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:57 crc kubenswrapper[4763]: I1006 14:53:57.996306 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:57Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.018439 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.042948 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.059674 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.075658 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.088679 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.091365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.091422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.091440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.091467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.091487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.107578 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.195058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 
14:53:58.195102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.195111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.195131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.195141 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.297345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.297413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.297436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.297466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.297490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.400070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.400126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.400142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.400164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.400178 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.503377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.503434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.503449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.503469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.503484 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.574883 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.574901 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:53:58 crc kubenswrapper[4763]: E1006 14:53:58.575033 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:53:58 crc kubenswrapper[4763]: E1006 14:53:58.575130 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.606345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.606394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.606406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.606426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.606439 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.609652 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds"] Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.610155 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.611978 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.612017 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.622956 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.635260 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.651170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4tbd\" (UniqueName: \"kubernetes.io/projected/0a93bc27-2eb6-438b-bc9c-cab665f898f3-kube-api-access-j4tbd\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.651229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.651257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.651279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.651609 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.668110 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.684824 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.702166 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.708688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.708741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.708753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.708774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.708786 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.735575 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c00280
39e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:55Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933050 6105 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933281 6105 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933495 6105 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933779 6105 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.933927 6105 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934026 6105 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934078 6105 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934586 6105 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.748260 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.752450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4tbd\" (UniqueName: \"kubernetes.io/projected/0a93bc27-2eb6-438b-bc9c-cab665f898f3-kube-api-access-j4tbd\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.752531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.752594 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.752686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.753257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.753928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.758274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a93bc27-2eb6-438b-bc9c-cab665f898f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.762736 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.778211 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4tbd\" (UniqueName: \"kubernetes.io/projected/0a93bc27-2eb6-438b-bc9c-cab665f898f3-kube-api-access-j4tbd\") pod \"ovnkube-control-plane-749d76644c-tzrds\" (UID: \"0a93bc27-2eb6-438b-bc9c-cab665f898f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.783486 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.800264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.811444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.811486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.811503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.811527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.811547 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.815089 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.833188 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.842958 4763 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/1.log" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.843840 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/0.log" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.846883 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" exitCode=1 Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.846988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.847071 4763 scope.go:117] "RemoveContainer" containerID="09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.848390 4763 scope.go:117] "RemoveContainer" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" Oct 06 14:53:58 crc kubenswrapper[4763]: E1006 14:53:58.848724 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.854824 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.869122 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.883637 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.901938 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.914394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.914481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.914506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.914538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.914568 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:58Z","lastTransitionTime":"2025-10-06T14:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.919860 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.931355 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.939668 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: W1006 14:53:58.944958 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a93bc27_2eb6_438b_bc9c_cab665f898f3.slice/crio-f8f708271551168e8eb4e3b0c829d5a66b105b0b07be50173a1dac02b8b3b0ad WatchSource:0}: Error finding container f8f708271551168e8eb4e3b0c829d5a66b105b0b07be50173a1dac02b8b3b0ad: Status 404 returned error can't find the container with id f8f708271551168e8eb4e3b0c829d5a66b105b0b07be50173a1dac02b8b3b0ad Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.976909 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c00280
39e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:55Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933050 6105 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933281 6105 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933495 6105 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933779 6105 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.933927 6105 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934026 6105 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934078 6105 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934586 6105 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:58 crc kubenswrapper[4763]: I1006 14:53:58.988991 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:58Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.004823 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.017329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.017435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.017455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.017528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.017547 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.022289 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.038469 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.053927 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.067775 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.084879 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.099647 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.112036 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.120367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.120400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.120415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.120433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.120444 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.128509 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6
036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.223330 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.223667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.223681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.223700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.223722 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.328445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.328524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.328550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.328582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.328608 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.431920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.432000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.432023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.432054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.432075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.535847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.535897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.535913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.535934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.535948 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.574661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:53:59 crc kubenswrapper[4763]: E1006 14:53:59.574816 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.639180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.639238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.639258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.639283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.639300 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.742810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.742855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.742865 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.742883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.742896 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.845572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.845692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.845719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.845749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.845766 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.859069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" event={"ID":"0a93bc27-2eb6-438b-bc9c-cab665f898f3","Type":"ContainerStarted","Data":"46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.859139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" event={"ID":"0a93bc27-2eb6-438b-bc9c-cab665f898f3","Type":"ContainerStarted","Data":"97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.859168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" event={"ID":"0a93bc27-2eb6-438b-bc9c-cab665f898f3","Type":"ContainerStarted","Data":"f8f708271551168e8eb4e3b0c829d5a66b105b0b07be50173a1dac02b8b3b0ad"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.863126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/1.log" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.869870 4763 scope.go:117] "RemoveContainer" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" Oct 06 14:53:59 crc kubenswrapper[4763]: E1006 14:53:59.870133 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.877286 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.898021 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.942022 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.949255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.949305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.949320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.949341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.949354 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:53:59Z","lastTransitionTime":"2025-10-06T14:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.981461 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:53:59 crc kubenswrapper[4763]: I1006 14:53:59.995377 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:53:59Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.008000 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.027696 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.043139 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.051687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.051717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.051728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.051744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.051755 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.057022 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.073364 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c00280
39e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d80c83ce9cafde0193e9812116b2174d8a82a0df1eb5b04aefcdc8fd1e333b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:55Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933050 6105 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933281 6105 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 14:53:55.933495 6105 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 14:53:55.933779 6105 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.933927 6105 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934026 6105 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934078 6105 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 14:53:55.934586 6105 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.075549 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hgd8l"] Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.076086 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.076154 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.084998 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.093937 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.105667 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.117845 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.129584 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.138362 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.147383 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.153831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.153861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.153870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.153884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.153894 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.157946 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.167180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9g54\" (UniqueName: \"kubernetes.io/projected/d6aeb0e7-db42-449d-8052-fc68154e93d2-kube-api-access-w9g54\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.167243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.173890 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.185901 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.201018 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.214352 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.226120 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.241539 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.255756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.255782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc 
kubenswrapper[4763]: I1006 14:54:00.255790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.255806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.255829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.258609 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268462 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 14:54:16.268432023 +0000 UTC m=+53.423724575 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268556 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268575 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268587 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268644 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268678 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268648 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:16.268634319 +0000 UTC m=+53.423926831 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268722 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:00.768710691 +0000 UTC m=+37.924003303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.268741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9g54\" (UniqueName: \"kubernetes.io/projected/d6aeb0e7-db42-449d-8052-fc68154e93d2-kube-api-access-w9g54\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268749 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268800 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:16.268785323 +0000 UTC m=+53.424077865 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.268820 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:16.268810314 +0000 UTC m=+53.424102856 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.273194 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.285434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.286490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9g54\" (UniqueName: \"kubernetes.io/projected/d6aeb0e7-db42-449d-8052-fc68154e93d2-kube-api-access-w9g54\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.302359 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.319798 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f09
5fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.329451 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.340283 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:00Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.358923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.358953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.358964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.358986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.358998 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.369713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.369876 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.369900 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.369910 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.369942 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:16.369930821 +0000 UTC m=+53.525223333 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.462087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.462127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.462140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.462158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.462169 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.564255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.564290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.564303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.564320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.564330 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.574936 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.575046 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.575097 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.575204 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.672550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.672652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.672675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.672702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.672722 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.774340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.774523 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: E1006 14:54:00.774654 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:01.774594624 +0000 UTC m=+38.929887176 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.776035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.776134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.776161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.776195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.776216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.878411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.878468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.878485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.878510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.878528 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.981500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.981573 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.981591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.981644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:00 crc kubenswrapper[4763]: I1006 14:54:00.981666 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:00Z","lastTransitionTime":"2025-10-06T14:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.084566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.084672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.084685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.084729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.084743 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.187950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.188015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.188040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.188071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.188093 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.291535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.291582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.291599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.291669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.291694 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.394467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.394526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.394549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.394577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.394595 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.497418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.497492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.497514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.497541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.497558 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.575004 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.575132 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:01 crc kubenswrapper[4763]: E1006 14:54:01.575239 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:01 crc kubenswrapper[4763]: E1006 14:54:01.575357 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.600440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.600510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.600534 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.600563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.600586 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.704496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.704551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.704570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.704595 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.704649 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.786539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:01 crc kubenswrapper[4763]: E1006 14:54:01.786741 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:01 crc kubenswrapper[4763]: E1006 14:54:01.786811 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:03.786793254 +0000 UTC m=+40.942085786 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.807411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.807454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.807469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.807492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.807504 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.910608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.910694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.910705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.910723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.910734 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:01Z","lastTransitionTime":"2025-10-06T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.974958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 14:54:01 crc kubenswrapper[4763]: I1006 14:54:01.997567 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:01Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.014413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.014476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.014496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.014523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.014547 4763 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.017045 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\
\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.046377 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c00280
39e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.062404 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.079877 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.100157 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.117259 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.118273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.118356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.118383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.118415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.118441 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.135178 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.155729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.173689 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.190865 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.212710 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.221253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.221465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.221587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.221747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.221855 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.231050 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.243555 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.272185 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.291979 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:02Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.324736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.324798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.324816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.324841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.324858 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.428229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.428519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.428606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.428731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.428846 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.531578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.531656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.531668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.531685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.531698 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.574299 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.574379 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:02 crc kubenswrapper[4763]: E1006 14:54:02.574450 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:02 crc kubenswrapper[4763]: E1006 14:54:02.574680 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.635050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.635107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.635124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.635149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.635166 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.738822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.738877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.738894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.738946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.738965 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.842030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.842072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.842088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.842113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.842131 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.945827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.945905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.945928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.945958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:02 crc kubenswrapper[4763]: I1006 14:54:02.945982 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:02Z","lastTransitionTime":"2025-10-06T14:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.050054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.050111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.050121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.050139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.050153 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.153388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.153777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.153866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.154034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.154129 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.257461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.257524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.257536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.257578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.257592 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.360827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.360892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.360905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.360923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.360934 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.463941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.464273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.464408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.464532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.464685 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.567384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.567449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.567475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.567512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.567535 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.575066 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.575196 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:03 crc kubenswrapper[4763]: E1006 14:54:03.575371 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:03 crc kubenswrapper[4763]: E1006 14:54:03.575749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.598087 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.631874 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c00280
39e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.651189 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.670247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.670309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.670325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.670352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.670370 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.673663 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.693268 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.711795 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.728071 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.744263 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.761655 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.773240 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.773272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.773281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.773295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.773324 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.782700 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.802450 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.810743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:03 crc kubenswrapper[4763]: E1006 14:54:03.811088 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:03 crc kubenswrapper[4763]: E1006 14:54:03.811216 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:07.811187365 +0000 UTC m=+44.966479917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.824959 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.845683 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.859185 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.875335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.875436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.875457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.875483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.875503 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.883692 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6
036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.902773 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea056
2d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:03Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.978181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.978236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.978253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.978277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:03 crc kubenswrapper[4763]: I1006 14:54:03.978293 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:03Z","lastTransitionTime":"2025-10-06T14:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.081112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.081165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.081183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.081218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.081238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.184269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.184308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.184316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.184332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.184342 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.287096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.287178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.287203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.287235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.287257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.305733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.305912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.305951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.305981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.306001 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.329773 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:04Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.339154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.339295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.339316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.339350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.339599 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.360237 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:04Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.365241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.365294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
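Every one of these "Error updating node status, will retry" failures has the same root cause already quoted in the error text: the network-node-identity webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-06. Below is a minimal diagnostic sketch, not part of the logged system, for confirming this from the node; it assumes Python with the third-party cryptography package is available and that the endpoint completes an anonymous TLS handshake (both assumptions, not facts from the log).

```python
# Hypothetical diagnostic: fetch the serving certificate of the webhook
# endpoint quoted in the errors above and compare its notAfter with "now".
import datetime
import socket
import ssl

from cryptography import x509  # third-party package; assumed installed

HOST, PORT = "127.0.0.1", 9743  # webhook address taken from the log lines

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False       # we only want to read the certificate,
ctx.verify_mode = ssl.CERT_NONE  # not to validate it (it is expired anyway)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # DER-encoded server cert

cert = x509.load_der_x509_certificate(der)
not_after = cert.not_valid_after  # naive UTC datetime
print("notAfter:", not_after)
print("expired:", datetime.datetime.utcnow() > not_after)
```

If the handshake succeeds, the printed notAfter should match the 2025-08-24T17:21:41Z timestamp quoted in the x509 errors above.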
event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.365306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.365325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.365337 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.382496 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:04Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.387449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.387508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
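Independently of the webhook failures, the node stays NotReady for the reason repeated in every heartbeat: no CNI configuration in /etc/kubernetes/cni/net.d/, presumably because the ovnkube-node pod that would write it is crash-looping (see the CrashLoopBackOff record below). A small sketch of the check the kubelet message implies, assuming Python is available on the node (an assumption, not something the log shows):

```python
# Hypothetical check mirroring the kubelet's complaint: list whatever CNI
# network configs exist in the directory named in the NetworkReady message.
import pathlib

CNI_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")  # path quoted in the log

if not CNI_DIR.is_dir():
    print(f"{CNI_DIR} is missing entirely")
else:
    confs = sorted(p.name for p in CNI_DIR.iterdir()
                   if p.suffix in (".conf", ".conflist", ".json"))
    if confs:
        print("CNI configs present:", ", ".join(confs))
    else:
        print(f"no CNI configuration file in {CNI_DIR} -- matches NetworkReady=false")
```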
event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.387526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.387550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.387569 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.407690 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.408083 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:04Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.408911 4763 scope.go:117] "RemoveContainer" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.409137 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.413512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.413547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.413560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.413576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.413592 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.431201 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:04Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.431364 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.434068 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.434287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.434316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.434346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.434366 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.537074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.537122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.537130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.537146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.537156 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.574743 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.574960 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.574743 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:04 crc kubenswrapper[4763]: E1006 14:54:04.575786 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.640542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.640602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.640650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.640672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.640683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.743827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.743890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.743908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.743937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.743956 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.847003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.847057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.847074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.847108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.847127 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.949064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.949093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.949101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.949115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:04 crc kubenswrapper[4763]: I1006 14:54:04.949124 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:04Z","lastTransitionTime":"2025-10-06T14:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.051781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.051836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.051855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.051882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.051899 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.154756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.154823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.154843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.154870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.154892 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.258357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.258416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.258435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.258464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.258481 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.361298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.361355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.361370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.361393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.361410 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.464332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.464415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.464438 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.464472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.464499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.567536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.567595 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.567641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.567667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.567685 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.575127 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.575201 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:05 crc kubenswrapper[4763]: E1006 14:54:05.575291 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:05 crc kubenswrapper[4763]: E1006 14:54:05.575498 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.670468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.670512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.670522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.670540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.670551 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.773937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.774038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.774057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.774539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.774594 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.877898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.877960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.877980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.878009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.878032 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.981341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.981418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.981441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.981471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:05 crc kubenswrapper[4763]: I1006 14:54:05.981491 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:05Z","lastTransitionTime":"2025-10-06T14:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.083828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.083856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.083866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.083884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.083894 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.186271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.186337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.186356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.186382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.186400 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.289536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.289593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.289610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.289660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.289679 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.392538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.392586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.392598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.392637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.392652 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.495466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.495529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.495545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.495571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.495589 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.574510 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.574518 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:06 crc kubenswrapper[4763]: E1006 14:54:06.574710 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:06 crc kubenswrapper[4763]: E1006 14:54:06.574897 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.623690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.623766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.623790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.623823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.623845 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.727057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.727121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.727141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.727168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.727185 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.830394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.830432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.830443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.830460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.830471 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.933903 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.933987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.934010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.934045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:06 crc kubenswrapper[4763]: I1006 14:54:06.934067 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:06Z","lastTransitionTime":"2025-10-06T14:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.037383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.037487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.037510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.037541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.037562 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.140788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.140857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.140876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.140904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.140924 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.245352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.245421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.245443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.245475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.245497 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.348969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.349016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.349033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.349058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.349073 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.451580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.451671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.451691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.451718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.451739 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.555389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.555452 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.555463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.555485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.555498 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.574213 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:07 crc kubenswrapper[4763]: E1006 14:54:07.574418 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.574775 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:07 crc kubenswrapper[4763]: E1006 14:54:07.575012 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.658545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.658675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.658704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.658739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.658770 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.762223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.762286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.762304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.762329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.762346 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.857187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:07 crc kubenswrapper[4763]: E1006 14:54:07.857442 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:07 crc kubenswrapper[4763]: E1006 14:54:07.857557 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:15.857523756 +0000 UTC m=+53.012816298 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.864874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.864956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.865003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.865027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.865043 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.967244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.967297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.967315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.967339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:07 crc kubenswrapper[4763]: I1006 14:54:07.967355 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:07Z","lastTransitionTime":"2025-10-06T14:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.069967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.070030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.070048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.070073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.070089 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.173077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.173175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.173197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.173228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.173247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.276348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.276403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.276419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.276444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.276463 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.380315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.380385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.380407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.380437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.380462 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.483905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.483986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.484011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.484045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.484071 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.574815 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.574908 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:08 crc kubenswrapper[4763]: E1006 14:54:08.575405 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:08 crc kubenswrapper[4763]: E1006 14:54:08.575605 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.587443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.587837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.588005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.588171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.588319 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.691988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.692049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.692075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.692103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.692126 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.795246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.795301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.795321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.795347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.795363 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.898792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.898989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.899022 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.899102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:08 crc kubenswrapper[4763]: I1006 14:54:08.899130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:08Z","lastTransitionTime":"2025-10-06T14:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.002766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.002885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.002908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.002938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.002959 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.105996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.106073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.106096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.106126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.106147 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.209216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.209272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.209292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.209319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.209341 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.312893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.312957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.312977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.313004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.313020 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.415785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.415861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.415883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.415912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.415932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.518840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.518906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.518928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.518956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.518978 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.574967 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.575023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:09 crc kubenswrapper[4763]: E1006 14:54:09.575176 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:09 crc kubenswrapper[4763]: E1006 14:54:09.575303 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.622290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.622349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.622368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.622396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.622421 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.725829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.725911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.725935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.725971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.726001 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.829105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.829172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.829189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.829216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.829234 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.932427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.932503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.932525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.932555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:09 crc kubenswrapper[4763]: I1006 14:54:09.932578 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:09Z","lastTransitionTime":"2025-10-06T14:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.035991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.036061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.036084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.036117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.036140 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.138834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.138891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.138909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.138936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.138953 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.242413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.242513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.242531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.242557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.242575 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.345415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.345469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.345489 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.345517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.345536 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.450771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.450820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.450832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.450850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.450867 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.553852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.553913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.553935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.553966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.553988 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.574466 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.574548 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:10 crc kubenswrapper[4763]: E1006 14:54:10.574659 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:10 crc kubenswrapper[4763]: E1006 14:54:10.574772 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.656546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.656606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.656647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.656674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.656691 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.759799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.759864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.759883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.759909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.759926 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.863198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.863237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.863246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.863262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.863273 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.965867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.965925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.965943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.965971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:10 crc kubenswrapper[4763]: I1006 14:54:10.965989 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:10Z","lastTransitionTime":"2025-10-06T14:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.068666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.068723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.068741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.068766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.068788 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.171095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.171144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.171155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.171173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.171184 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.274136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.274176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.274185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.274202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.274213 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.377829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.377895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.377918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.377947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.377967 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.481398 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.481461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.481480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.481504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.481522 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.574360 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.574501 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:11 crc kubenswrapper[4763]: E1006 14:54:11.574550 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:11 crc kubenswrapper[4763]: E1006 14:54:11.574792 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.584441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.584513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.584531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.584558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.584575 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.688056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.688137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.688155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.688179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.688196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.791997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.792056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.792077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.792102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.792123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.895799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.896197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.896338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.896474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:11 crc kubenswrapper[4763]: I1006 14:54:11.896652 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:11Z","lastTransitionTime":"2025-10-06T14:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.000395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.000453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.000471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.000495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.000512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.103574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.103667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.103684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.103709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.103728 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.206333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.206794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.206842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.206872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.206896 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.309817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.309891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.309916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.309946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.309966 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.412787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.412850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.412872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.412902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.412923 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.515925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.516355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.516522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.516713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.516875 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.574139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:12 crc kubenswrapper[4763]: E1006 14:54:12.574348 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.574599 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:12 crc kubenswrapper[4763]: E1006 14:54:12.574877 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.620745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.620828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.620854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.620886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.620919 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.723395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.723454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.723473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.723496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.723513 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.826359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.826415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.826440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.826468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.826521 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.930051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.930122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.930141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.930167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:12 crc kubenswrapper[4763]: I1006 14:54:12.930183 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:12Z","lastTransitionTime":"2025-10-06T14:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.033761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.033863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.033902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.033933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.033971 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.136847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.136893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.136909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.136934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.136957 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.240963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.241059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.241078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.241133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.241152 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.344473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.344517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.344529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.344548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.344561 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.447425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.447483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.447507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.447537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.447557 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.549815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.549877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.549896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.549921 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.549938 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.574278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.574422 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:13 crc kubenswrapper[4763]: E1006 14:54:13.574564 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
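
Annotation: the stretch above is the kubelet re-evaluating node readiness roughly every 100 ms. The container runtime keeps reporting NetworkReady=false because no CNI configuration exists yet in /etc/kubernetes/cni/net.d/ (the path OpenShift points the runtime at; stock Kubernetes typically uses /etc/cni/net.d), so each sync records NodeNotReady and any pod that still needs a sandbox ("No sandbox for pod can be found") is skipped until the network plugin writes its config. Below is a minimal sketch, not the runtime's actual code, of the check being made; the directory path is taken from the log, the accepted extensions (.conf, .conflist, .json) are the ones CNI loaders commonly accept, and it assumes it is run directly on the node:

    import json
    import pathlib

    # Path taken verbatim from the kubelet error above.
    CNI_CONF_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")

    def cni_configs(d: pathlib.Path):
        # CNI config loaders commonly accept these extensions (assumption).
        return sorted(p for p in d.glob("*") if p.suffix in {".conf", ".conflist", ".json"})

    confs = cni_configs(CNI_CONF_DIR)
    if not confs:
        # This is the state the log shows: NetworkReady=false, NetworkPluginNotReady.
        print(f"no CNI configuration file in {CNI_CONF_DIR}/ - network plugin not ready")
    else:
        for p in confs:
            # A CNI config carries the network name at its top level.
            name = json.loads(p.read_text()).get("name", "?")
            print(f"{p.name}: network {name!r}")

On a healthy node this would list the OVN-Kubernetes conflist once the multus/ovnkube pods have started; while it prints nothing, the NodeNotReady loop in the log is expected behavior.
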
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:13 crc kubenswrapper[4763]: E1006 14:54:13.574804 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.594071 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28
c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.617462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d746
2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.640008 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.652419 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.653278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.653341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.653351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.653431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.653443 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.666465 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.683775 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.702447 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.719723 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.739226 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756832 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.756932 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.793683 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.812866 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.840720 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.858741 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.860292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.860337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.860353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.860377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.860394 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.877568 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.891086 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:13Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.963333 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.963369 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.963393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.963416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:13 crc kubenswrapper[4763]: I1006 14:54:13.963429 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:13Z","lastTransitionTime":"2025-10-06T14:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.066742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.066804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.066821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.066847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.066868 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.170035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.170114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.170130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.170184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.170205 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.274783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.274872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.274891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.274962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.274979 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.378770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.378813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.378821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.378838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.378848 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.481288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.481672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.481824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.481936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.482044 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.574589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.574647 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.574761 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.574867 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.585217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.585255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.585271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.585327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.585346 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.688722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.688795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.688819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.688852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.688875 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.714503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.714768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.714837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.714908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.714975 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.737235 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:14Z is after 
2025-08-24T17:21:41Z"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.741905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.741973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.741991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.742019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.742037 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.762424 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:14Z is after 2025-08-24T17:21:41Z"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.767706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.767767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.767787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.767813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.767831 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.787814 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:14Z is after 2025-08-24T17:21:41Z"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.799394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.799444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.799455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.799473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.799485 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.818707 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:14Z is after 2025-08-24T17:21:41Z"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.824149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.824201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.824218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.824243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.824261 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.844432 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:14Z is after 2025-08-24T17:21:41Z"
2025-08-24T17:21:41Z" Oct 06 14:54:14 crc kubenswrapper[4763]: E1006 14:54:14.844736 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.847246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.847301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.847321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.847346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.847363 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.949665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.949995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.950207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.950397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:14 crc kubenswrapper[4763]: I1006 14:54:14.950534 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:14Z","lastTransitionTime":"2025-10-06T14:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.053009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.053052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.053062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.053081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.053091 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.156413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.156477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.156496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.156522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.156540 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.259813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.259887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.259910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.259940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.259962 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.363067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.363130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.363150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.363176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.363197 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.467353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.467896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.467967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.468037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.468151 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.571227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.571288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.571299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.571318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.571329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.574685 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.574753 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:15 crc kubenswrapper[4763]: E1006 14:54:15.574858 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:15 crc kubenswrapper[4763]: E1006 14:54:15.574977 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.673932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.673992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.674009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.674034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.674051 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.776981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.777032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.777046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.777067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.777082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.880156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.880213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.880225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.880244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.880258 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.951549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:15 crc kubenswrapper[4763]: E1006 14:54:15.952355 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:15 crc kubenswrapper[4763]: E1006 14:54:15.952682 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:31.952607746 +0000 UTC m=+69.107900298 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.983500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.983574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.983599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.983667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:15 crc kubenswrapper[4763]: I1006 14:54:15.983685 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:15Z","lastTransitionTime":"2025-10-06T14:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.085963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.086029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.086047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.086076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.086094 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.188584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.188679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.188691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.188710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.188721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.291410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.291481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.291500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.291528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.291547 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.356032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.356136 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.356170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356220 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:54:48.356176468 +0000 UTC m=+85.511469030 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356279 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356297 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356326 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:48.356312712 +0000 UTC m=+85.511605224 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.356334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356388 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:48.356366814 +0000 UTC m=+85.511659366 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356564 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356595 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356656 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.356733 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:48.356703994 +0000 UTC m=+85.511996586 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.394338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.394403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.394429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.394462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.394486 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.409550 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.423673 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.428527 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.449279 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.457741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.457932 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.457967 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.457986 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.458059 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:54:48.458038647 +0000 UTC m=+85.613331199 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.469205 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.484036 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.497425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.497469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.497481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.497500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.497512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.500414 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.517416 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.537201 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.552203 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.574071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.574301 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.574074 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:16 crc kubenswrapper[4763]: E1006 14:54:16.574922 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.579677 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28
c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.599790 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.601174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.601214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.601244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.601263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.601275 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.613083 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.627808 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.647247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f09
5fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.658262 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.672336 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.686190 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:16Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.704310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.704370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.704385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.704406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.704421 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.807920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.807991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.808018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.808046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.808064 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.911034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.911113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.911140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.911173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:16 crc kubenswrapper[4763]: I1006 14:54:16.911195 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:16Z","lastTransitionTime":"2025-10-06T14:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.013569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.013668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.013695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.013727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.013748 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.117048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.117404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.117580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.117803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.117983 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.221918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.222345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.222363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.222390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.222407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.325407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.325710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.325800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.325869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.325931 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.429073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.429146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.429168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.429196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.429216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.532768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.532842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.532853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.532875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.532889 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.574490 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.574897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:17 crc kubenswrapper[4763]: E1006 14:54:17.575185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:17 crc kubenswrapper[4763]: E1006 14:54:17.575411 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.577172 4763 scope.go:117] "RemoveContainer" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.638292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.638357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.638367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.638394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.638407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.740934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.741225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.741321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.741423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.741507 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.844901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.844936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.844945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.844958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.844967 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.939430 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/1.log" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.944067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.944829 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.949854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.949961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.949981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.950019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.950042 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:17Z","lastTransitionTime":"2025-10-06T14:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.967192 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:17Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:17 crc kubenswrapper[4763]: I1006 14:54:17.985541 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:17Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.002100 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:17Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.018069 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.028962 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.063050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.063109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.063123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.063147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.063161 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.070632 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.108488 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.142829 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.165747 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.166018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.166034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.166045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.166063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.166075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.182714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.195351 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.213012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f09
5fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.226961 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.237719 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.251096 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.265679 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.268791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.268843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.268861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.268885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.268901 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.279113 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.371703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.371744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.371755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.371775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.371787 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.474456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.474528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.474552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.474585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.474604 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.574903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.574944 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:18 crc kubenswrapper[4763]: E1006 14:54:18.575106 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:18 crc kubenswrapper[4763]: E1006 14:54:18.575271 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.577949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.578001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.578019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.578040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.578058 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.680557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.681017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.681029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.681050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.681065 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.784966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.785032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.785044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.785062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.785073 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.888229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.888283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.888298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.888317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.888331 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.951178 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/2.log" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.952364 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/1.log" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.957261 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0" exitCode=1 Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.957325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.957380 4763 scope.go:117] "RemoveContainer" containerID="5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.958266 4763 scope.go:117] "RemoveContainer" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0" Oct 06 14:54:18 crc kubenswrapper[4763]: E1006 14:54:18.958574 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.975777 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.991684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.991732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.991749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.991774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.991793 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:18Z","lastTransitionTime":"2025-10-06T14:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:18 crc kubenswrapper[4763]: I1006 14:54:18.993273 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:18Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.011185 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.034317 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.054440 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.079673 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc 
kubenswrapper[4763]: I1006 14:54:19.095741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.095832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.095858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.095887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.095907 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.097140 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.121257 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.145120 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.163970 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.184936 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.198714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.198761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.198773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.198794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.198808 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.216964 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a494e140a8d7fdfbe227fbe4197354852c0028039e727825e5a41b2c19c02be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:53:57Z\\\",\\\"message\\\":\\\"Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:53:57.818160 6246 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}\\\\nI1006 14:53:57.818165 6246 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-bj6z5\\\\nI1006 14:53:57.818169 6246 services_controller.go:360] Finished syncing service route-controller-manager on namespace openshift-route-controller-manager for network=default : 1.138324ms\\\\nI1006 14:53:57.818180 6246 services_controller.go:356] Processing sync for service openshift-kube-controller-manager/kube-controller-manager for network=default\\\\nI1006 14:53:57.818194 6246 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1006 14:53:57.817792 6246 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initial\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.232952 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.247468 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.264353 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.280009 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.298458 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.301418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.301467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.301485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.301516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.301532 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.404070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.404125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.404135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.404155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.404165 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.506919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.506998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.507009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.507029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.507041 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.574940 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.574952 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:19 crc kubenswrapper[4763]: E1006 14:54:19.575153 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:19 crc kubenswrapper[4763]: E1006 14:54:19.575349 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.610714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.610775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.610787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.610807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.610818 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.714500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.714565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.714584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.714610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.714659 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.817487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.817548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.817563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.817582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.817596 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.920736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.920788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.920797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.920817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.920829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:19Z","lastTransitionTime":"2025-10-06T14:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.962550 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/2.log" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.967155 4763 scope.go:117] "RemoveContainer" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0" Oct 06 14:54:19 crc kubenswrapper[4763]: E1006 14:54:19.967323 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.983849 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:19 crc kubenswrapper[4763]: I1006 14:54:19.997603 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:19Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.015912 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.023806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.023983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.024076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.024189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.024276 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.031585 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.048088 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.071245 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe
30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.085357 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.103530 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.119557 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.126964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.127029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.127101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.127125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.127137 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.144979 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.164721 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.185294 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ff
ac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.204654 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.221821 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.230028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.230091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.230108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.230134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.230153 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.239994 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.265893 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller 
ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",
\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.279667 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:20Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.333124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.333171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.333193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.333219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.333238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.436320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.436402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.436428 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.436458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.436483 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.540339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.540413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.540429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.540454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.540472 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.574215 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.574215 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:20 crc kubenswrapper[4763]: E1006 14:54:20.574481 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:20 crc kubenswrapper[4763]: E1006 14:54:20.574567 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.643928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.643995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.644016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.644046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.644068 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.747587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.747680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.747699 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.747724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.747745 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.850701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.850774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.850825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.850857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.850879 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.954713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.954809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.954834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.954864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:20 crc kubenswrapper[4763]: I1006 14:54:20.954885 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:20Z","lastTransitionTime":"2025-10-06T14:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.057316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.057376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.057394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.057421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.057440 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.161093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.161174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.161196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.161223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.161240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.264201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.264269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.264292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.264323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.264350 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.366722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.366758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.366771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.366786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.366797 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.470016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.470072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.470085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.470109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.470124 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.573044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.573091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.573100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.573117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.573127 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.574200 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.574343 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:21 crc kubenswrapper[4763]: E1006 14:54:21.574450 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:21 crc kubenswrapper[4763]: E1006 14:54:21.574606 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.675678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.676040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.676165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.676284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.676435 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.779742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.779830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.779852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.779885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.779907 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.882028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.882064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.882073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.882087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.882095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.984588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.984663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.984674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.984690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:21 crc kubenswrapper[4763]: I1006 14:54:21.984700 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:21Z","lastTransitionTime":"2025-10-06T14:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.087692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.087728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.087737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.087752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.087761 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.191130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.191208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.191233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.191270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.191295 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.294274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.294341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.294360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.294388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.294411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.398251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.398315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.398337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.398362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.398380 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.501718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.501793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.501816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.501842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.501858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.574271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:22 crc kubenswrapper[4763]: E1006 14:54:22.574442 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.574804 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:22 crc kubenswrapper[4763]: E1006 14:54:22.574945 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.604299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.604356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.604376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.604400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.604419 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.707037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.707094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.707107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.707124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.707134 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.809912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.809967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.809982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.810005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.810019 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.913021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.913078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.913096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.913121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:22 crc kubenswrapper[4763]: I1006 14:54:22.913139 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:22Z","lastTransitionTime":"2025-10-06T14:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.016027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.016077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.016095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.016115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.016130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.118796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.118844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.118889 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.118914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.118932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.221352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.221411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.221429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.221455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.221474 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.324676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.324748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.324770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.324800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.324827 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.427770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.427830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.427847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.427874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.427891 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.530364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.530434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.530453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.530478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.530495 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.574215 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.574199 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:23 crc kubenswrapper[4763]: E1006 14:54:23.574781 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:23 crc kubenswrapper[4763]: E1006 14:54:23.575196 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.596462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287fa
af92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.617418 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.633933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.633989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.634006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.634030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.634047 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.635958 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.657453 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.690476 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f09
5fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.708219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.722878 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.736469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.736725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.736816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.736908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.736994 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.740125 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.758022 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.777283 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.796115 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.813314 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.834315 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.839768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.839824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.839842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.839868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.839891 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.854173 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.871809 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.891473 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.908754 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:23Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.942828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.942893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:23 crc 
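
The repeated `x509: certificate has expired or is not yet valid` failure above is a pure time-window comparison: the serving certificate for the `pod.network-node-identity.openshift.io` webhook on 127.0.0.1:9743 stopped being valid on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-06. A minimal sketch of that validity check, using the two timestamps from the log entry (the helper name and the notBefore value are illustrative assumptions, not taken from kubelet or Go crypto/x509 source):

```python
from datetime import datetime, timezone

def within_validity_window(not_before: datetime, not_after: datetime,
                           now: datetime) -> bool:
    """Mirror the x509 time check: valid only if notBefore <= now <= notAfter."""
    return not_before <= now <= not_after

# notAfter and "current time" are taken from the log entry above.
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)
now = datetime(2025, 10, 6, 14, 54, 23, tzinfo=timezone.utc)
# The issue date is not in the log; one year before expiry is an assumption.
not_before = datetime(2024, 8, 24, 17, 21, 41, tzinfo=timezone.utc)

if not within_validity_window(not_before, not_after, now):
    print("certificate has expired or is not yet valid: "
          f"current time {now.isoformat()} is after {not_after.isoformat()}")
```

Because the webhook intercepts every status patch, each retry below fails the same way until the certificate is rotated or the node clock is corrected.
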
kubenswrapper[4763]: I1006 14:54:23.942913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.942941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:23 crc kubenswrapper[4763]: I1006 14:54:23.942960 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:23Z","lastTransitionTime":"2025-10-06T14:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.046136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.046209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.046228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.046257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.046276 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.149340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.149417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.149431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.149449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.149460 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.252115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.252178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.252196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.252220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.252236 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.355671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.355722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.355756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.355776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.355788 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.459022 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.459079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.459100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.459135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.459162 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.562183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.562251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.562268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.562293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.562309 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.574599 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.574699 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:24 crc kubenswrapper[4763]: E1006 14:54:24.575176 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:24 crc kubenswrapper[4763]: E1006 14:54:24.575199 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
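
The `NetworkReady=false ... no CNI configuration file in /etc/kubernetes/cni/net.d/` condition driving both the node heartbeats and the `Error syncing pod` entries above persists until the network provider (OVN-Kubernetes/Multus on this cluster) writes a network config into the CNI conf dir. A minimal sketch of that directory probe, under the assumption that a readiness decision only needs one config file to exist (the function name is illustrative; `.conf`, `.conflist`, and `.json` are the extensions libcni-style loaders conventionally accept):

```python
import os

def cni_config_present(conf_dir: str = "/etc/kubernetes/cni/net.d/") -> bool:
    """Return True once at least one CNI network config file exists."""
    try:
        entries = os.listdir(conf_dir)
    except FileNotFoundError:
        return False  # directory not created yet: network provider not started
    return any(name.endswith((".conf", ".conflist", ".json")) for name in entries)

if not cni_config_present():
    print("container runtime network not ready: NetworkReady=false "
          "reason:NetworkPluginNotReady - has your network provider started?")
```

This is why `network-check-source` and `network-check-target` cannot get sandboxes: pod networking is blocked behind the same missing config.
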
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.665917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.665977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.665994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.666023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.666044 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.769513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.769582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.769606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.769674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.769693 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.872144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.872206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.872225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.872252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.872272 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.975755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.975800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.975809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.975828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:24 crc kubenswrapper[4763]: I1006 14:54:24.975857 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:24Z","lastTransitionTime":"2025-10-06T14:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.039052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.039126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.039143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.039170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.039190 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.060868 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:25Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.066485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.066557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
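
The payload the kubelet failed to send above is a strategic merge patch against the Node status: the `$setElementOrder/conditions` directive pins the ordering of the conditions list while only the changed condition fields travel in the request. A minimal sketch of assembling a body of that shape (values abbreviated to the Ready condition; this illustrates the patch format, not actual kubelet code):

```python
import json
from datetime import datetime, timezone

now = datetime(2025, 10, 6, 14, 54, 25,
               tzinfo=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

condition_types = ["MemoryPressure", "DiskPressure", "PIDPressure", "Ready"]
patch = {
    "status": {
        # Strategic-merge directive: keep the conditions list in this order.
        "$setElementOrder/conditions": [{"type": t} for t in condition_types],
        "conditions": [{
            "type": "Ready",
            "status": "False",
            "reason": "KubeletNotReady",
            "message": "container runtime network not ready: NetworkReady=false",
            "lastHeartbeatTime": now,
            "lastTransitionTime": now,
        }],
    },
}

# Sent to the API server as a PATCH with a strategic-merge-patch content type.
print(json.dumps(patch))
```

The triple-escaped quoting in the journal entry is just this JSON body rendered inside two layers of quoted log strings.
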
event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.066584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.066658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.066688 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.087388 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:25Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.093119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.093393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.093652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.093883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.094071 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.116206 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:25Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.122523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.122691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.122729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.122779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.122798 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.144597 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:25Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.150597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.150728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
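The failed status patches above all share one root cause, visible in the error tail of each attempt: the "node.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 is presenting a TLS certificate that expired on 2025-08-24T17:21:41Z, weeks before the clock time in the log (2025-10-06T14:54:25Z), so every node-status PATCH is rejected when the API server calls the webhook. A minimal Go sketch for confirming the expiry from the node, assuming only the endpoint taken from the log (InsecureSkipVerify is used solely to read the expired certificate, never to trust it):

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Webhook endpoint copied from the kubelet error above.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("dial webhook: %v", err)
        }
        defer conn.Close()

        // The leaf certificate is the first one in the presented chain.
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Printf("subject:  %s\n", cert.Subject)
        fmt.Printf("notAfter: %s\n", cert.NotAfter)
        fmt.Printf("expired:  %v\n", time.Now().After(cert.NotAfter))
    }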
event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.150747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.150774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.150791 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.171230 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.171980 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.174315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
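The "update node status exceeds retry count" error above marks the kubelet giving up for this sync cycle: in the upstream kubelet source, updateNodeStatus calls tryUpdateNodeStatus up to nodeStatusUpdateRetry (5) times, and each failed attempt logs the entire status patch, which is why the same large payload was dumped repeatedly above. A simplified sketch of that bounded-retry pattern (the constant mirrors the upstream value; the loop is an illustration, not the kubelet's actual code):

    package main

    import (
        "errors"
        "fmt"
    )

    // nodeStatusUpdateRetry mirrors the upstream kubelet constant.
    const nodeStatusUpdateRetry = 5

    // updateNodeStatus retries a status patch a bounded number of times,
    // echoing the kubelet's "will retry" / "exceeds retry count" messages.
    func updateNodeStatus(patch func(attempt int) error) error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := patch(i); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        // Simulate a webhook that rejects every patch, as in the log above.
        err := updateNodeStatus(func(attempt int) error {
            return fmt.Errorf("attempt %d: webhook certificate expired", attempt+1)
        })
        fmt.Println(err)
    }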
event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.174386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.174410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.174444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.174467 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.278148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.278202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.278220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.278253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.278270 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.381983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.382049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.382076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.382102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.382118 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.484585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.484694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.484713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.484742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.484760 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.574842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.574955 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.575059 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:25 crc kubenswrapper[4763]: E1006 14:54:25.575174 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.586165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.586199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.586208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.586218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.586227 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.690156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.690224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.690242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.690270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.690288 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.793171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.793241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.793260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.793290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.793311 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.896268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.896923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.897021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.897118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.897216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.999338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.999396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.999411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.999432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:25 crc kubenswrapper[4763]: I1006 14:54:25.999447 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:25Z","lastTransitionTime":"2025-10-06T14:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.101676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.101712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.101720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.101735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.101744 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.203900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.203940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.203951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.203971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.203984 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.306300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.306376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.306395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.306421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.306438 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.409052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.409120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.409143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.409176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.409197 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.513189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.513280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.513302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.513336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.513369 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.575681 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.575743 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:26 crc kubenswrapper[4763]: E1006 14:54:26.575880 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:26 crc kubenswrapper[4763]: E1006 14:54:26.576194 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.615371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.615429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.615442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.615462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.615475 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.718312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.718367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.718385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.718412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.718429 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.821038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.821096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.821113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.821141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.821161 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.924005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.924054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.924067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.924088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:26 crc kubenswrapper[4763]: I1006 14:54:26.924103 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:26Z","lastTransitionTime":"2025-10-06T14:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.026322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.026380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.026400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.026430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.026448 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:27Z","lastTransitionTime":"2025-10-06T14:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.129685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.129737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.129749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.129769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.129783 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:27Z","lastTransitionTime":"2025-10-06T14:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the same five-line node-status block (NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady / "Node became not ready") repeats at 14:54:27.233233, 14:54:27.336203, 14:54:27.439669 and 14:54:27.542854]
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.574461 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:27 crc kubenswrapper[4763]: I1006 14:54:27.574498 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:27 crc kubenswrapper[4763]: E1006 14:54:27.574638 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:27 crc kubenswrapper[4763]: E1006 14:54:27.574773 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
[node-status block repeats at 14:54:27.645407 and 14:54:27.748898]
[node-status block repeats at 14:54:27.851325, 14:54:27.955406, 14:54:28.058723, 14:54:28.163082, 14:54:28.265666, 14:54:28.368371, 14:54:28.471728 and 14:54:28.573879]
Oct 06 14:54:28 crc kubenswrapper[4763]: I1006 14:54:28.574026 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:28 crc kubenswrapper[4763]: I1006 14:54:28.574154 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:28 crc kubenswrapper[4763]: E1006 14:54:28.574233 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:28 crc kubenswrapper[4763]: E1006 14:54:28.574319 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block repeats at 14:54:28.676121 and 14:54:28.778882]
[node-status block repeats at 14:54:28.881601, 14:54:28.984828, 14:54:29.087353, 14:54:29.189539, 14:54:29.291998, 14:54:29.395011 and 14:54:29.497743]
Oct 06 14:54:29 crc kubenswrapper[4763]: I1006 14:54:29.574811 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:29 crc kubenswrapper[4763]: I1006 14:54:29.574905 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:29 crc kubenswrapper[4763]: E1006 14:54:29.574989 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:29 crc kubenswrapper[4763]: E1006 14:54:29.575078 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
[node-status block repeats at 14:54:29.599243 and 14:54:29.701676]
[node-status block repeats at 14:54:29.804546, 14:54:29.907542, 14:54:30.010279, 14:54:30.113112, 14:54:30.215667, 14:54:30.318862, 14:54:30.420794 and 14:54:30.524272]
Oct 06 14:54:30 crc kubenswrapper[4763]: I1006 14:54:30.573956 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:30 crc kubenswrapper[4763]: I1006 14:54:30.573958 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:30 crc kubenswrapper[4763]: E1006 14:54:30.574255 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:54:30 crc kubenswrapper[4763]: E1006 14:54:30.574109 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[node-status block repeats at 14:54:30.627856 and 14:54:30.734584]
[node-status block repeats at 14:54:30.840759, 14:54:30.943629, 14:54:31.045776, 14:54:31.147782, 14:54:31.251209, 14:54:31.353447, 14:54:31.456168 and 14:54:31.558288]
Oct 06 14:54:31 crc kubenswrapper[4763]: I1006 14:54:31.574645 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:31 crc kubenswrapper[4763]: I1006 14:54:31.574652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:31 crc kubenswrapper[4763]: E1006 14:54:31.574927 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
Oct 06 14:54:31 crc kubenswrapper[4763]: E1006 14:54:31.574953 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:31 crc kubenswrapper[4763]: I1006 14:54:31.575649 4763 scope.go:117] "RemoveContainer" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0"
Oct 06 14:54:31 crc kubenswrapper[4763]: E1006 14:54:31.575811 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"
[node-status block repeats at 14:54:31.662006 and 14:54:31.765319]
[node-status block repeats at 14:54:31.868196 and 14:54:31.971633]
Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.039757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:32 crc kubenswrapper[4763]: E1006 14:54:32.039985 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 14:54:32 crc kubenswrapper[4763]: E1006 14:54:32.040082 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:55:04.040059785 +0000 UTC m=+101.195352357 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
[node-status block repeats at 14:54:32.074605 and 14:54:32.177056]
Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.279980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.280015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.280026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.280042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.280053 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.383166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.383224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.383241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.383263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.383279 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.485873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.485907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.485919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.485933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.485942 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.574830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.574903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:32 crc kubenswrapper[4763]: E1006 14:54:32.574963 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:32 crc kubenswrapper[4763]: E1006 14:54:32.575121 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.588602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.588739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.588767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.588800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.588823 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.691476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.691539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.691558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.691582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.691599 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.794503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.794555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.794571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.794593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.794611 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.897961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.898019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.898038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.898062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:32 crc kubenswrapper[4763]: I1006 14:54:32.898079 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:32Z","lastTransitionTime":"2025-10-06T14:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.001066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.001102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.001110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.001124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.001133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.011693 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/0.log" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.011729 4763 generic.go:334] "Generic (PLEG): container finished" podID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" containerID="6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f" exitCode=1 Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.011752 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerDied","Data":"6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.012032 4763 scope.go:117] "RemoveContainer" containerID="6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.025533 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.039955 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.052164 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.062054 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.077057 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.093483 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.104225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.104266 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.104276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.104291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.104302 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.109661 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.124366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.139432 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.159588 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b
6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.171538 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.183251 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.196479 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.207356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.207416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.207433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.207456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.207473 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.211793 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.224552 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.236456 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.248788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.310916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.310987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.311010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.311039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.311061 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.414185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.414234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.414246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.414267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.414283 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.517568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.517596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.517604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.517631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.517640 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.574738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.574956 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:33 crc kubenswrapper[4763]: E1006 14:54:33.575095 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:33 crc kubenswrapper[4763]: E1006 14:54:33.575369 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.588814 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.601850 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.620302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.620343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.620357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.620378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.620393 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.622865 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.634810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.646247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.660303 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.672604 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.683174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.695571 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.706382 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.717334 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.723009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.723047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.723055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.723069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.723078 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.728226 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.738776 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.750673 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.759702 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.772924 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.786300 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:33Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.825219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.825323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.825341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.825364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.825381 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.929192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.929246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.929254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.929269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:33 crc kubenswrapper[4763]: I1006 14:54:33.929279 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:33Z","lastTransitionTime":"2025-10-06T14:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.017182 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/0.log" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.017236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerStarted","Data":"ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.029724 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.030968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.031003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.031014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.031029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.031040 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.047052 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.058497 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.076316 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.089203 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.099452 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.110659 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.126900 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.133943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.133977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.133985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.133998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.134007 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.136319 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.146497 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.161231 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.171816 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.180267 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.194187 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.206449 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.216322 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.229847 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:34Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.236594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.236641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.236652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.236669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.236683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.338819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.338881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.338893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.338908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.338919 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.441238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.441276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.441287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.441301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.441314 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.543278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.543316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.543327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.543342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.543353 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.574591 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.574677 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:34 crc kubenswrapper[4763]: E1006 14:54:34.574700 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:34 crc kubenswrapper[4763]: E1006 14:54:34.574790 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.646008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.646067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.646089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.646118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.646141 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.755950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.756025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.756046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.756071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.756097 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.859382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.859490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.859510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.860121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.860430 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.964154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.964226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.964249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.964273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:34 crc kubenswrapper[4763]: I1006 14:54:34.964297 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:34Z","lastTransitionTime":"2025-10-06T14:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.067319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.067362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.067397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.067414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.067424 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.171572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.171694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.171717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.171741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.171757 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.275031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.275224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.275264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.275294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.275316 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.378922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.378997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.379018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.379046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.379070 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.380515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.380576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.380599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.380841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.380865 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.394555 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:35Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.400285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.400359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.400383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.400412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.400434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.416989 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:35Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.421034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.421091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.421111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.421135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.421152 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.432858 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:35Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.437048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.437100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.437120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.437142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.437159 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.448073 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:35Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.451859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.451892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.451903 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.451920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.451932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.464432 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:35Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.464572 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481484 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481508 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.574039 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.574060 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.574178 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:35 crc kubenswrapper[4763]: E1006 14:54:35.574334 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.583160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.583211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.583234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.583258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.583275 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.685394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.685449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.685465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.685490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.685539 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.787709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.787757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.787773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.787791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.787806 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.891025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.891068 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.891080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.891097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.891108 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.993592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.993725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.993751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.993780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.993804 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.096186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.096242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.096255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.096276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.096292 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:36Z","lastTransitionTime":"2025-10-06T14:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.198525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.198579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.198591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.198607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.198651 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:36Z","lastTransitionTime":"2025-10-06T14:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.301699 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.301766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.301788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.301825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.301880 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:36Z","lastTransitionTime":"2025-10-06T14:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.403488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.403542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.403559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.403579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.403596 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:36Z","lastTransitionTime":"2025-10-06T14:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.505993 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.506059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.506076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.506100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.506117 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:36Z","lastTransitionTime":"2025-10-06T14:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
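Each "Node became not ready" entry embeds the Ready condition as inline JSON; this is the same object kubelet writes to the Node's .status.conditions. A small sketch for pulling that payload out of a journal line for inspection (the sample line is abbreviated, and the regex assumes condition={...} ends the line, as in the entries above):

# parse_ready_condition.py - sketch: extract the condition={...} JSON from a
# "Node became not ready" journal line like the ones in this log.
import json
import re

line = 'Oct 06 14:54:35 crc kubenswrapper[4763]: I1006 14:54:35.481508 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:35Z","lastTransitionTime":"2025-10-06T14:54:35Z","reason":"KubeletNotReady","message":"..."}'

m = re.search(r'condition=(\{.*\})\s*$', line)
if m:
    cond = json.loads(m.group(1))
    print(cond["type"], cond["status"], cond["reason"])  # Ready False KubeletNotReady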
Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.574160 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:36 crc kubenswrapper[4763]: E1006 14:54:36.574272 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:36 crc kubenswrapper[4763]: I1006 14:54:36.574430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:36 crc kubenswrapper[4763]: E1006 14:54:36.574472 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[log trimmed: the same five node-status entries repeat at roughly 100 ms intervals from 14:54:36.607 through 14:54:37.532]
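Every sync failure in this excerpt shares the same precondition: there is no CNI config file under /etc/kubernetes/cni/net.d/, so kubelet keeps reporting NetworkReady=false until the network operator writes one. A sketch that polls for a config file appearing, with the directory taken from the log and an arbitrary two-second interval:

# wait_for_cni_config.py - sketch: poll the CNI conf dir named in the kubelet
# errors until a config file (.conf/.conflist/.json) shows up.
import os
import time

CNI_DIR = "/etc/kubernetes/cni/net.d"  # path quoted in the errors above

while True:
    try:
        confs = [f for f in os.listdir(CNI_DIR)
                 if f.endswith((".conf", ".conflist", ".json"))]
    except FileNotFoundError:
        confs = []
    if confs:
        print("CNI config present:", confs)
        break
    print("no CNI config yet; kubelet will keep reporting NetworkReady=false")
    time.sleep(2)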
Oct 06 14:54:37 crc kubenswrapper[4763]: I1006 14:54:37.574983 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:37 crc kubenswrapper[4763]: I1006 14:54:37.575044 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:37 crc kubenswrapper[4763]: E1006 14:54:37.575215 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
Oct 06 14:54:37 crc kubenswrapper[4763]: E1006 14:54:37.575561 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:37 crc kubenswrapper[4763]: I1006 14:54:37.591111 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
[log trimmed: the same five node-status entries repeat at roughly 100 ms intervals from 14:54:37.635 through 14:54:38.563]
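At this density it is easier to aggregate the journal than to read it linearly. A sketch that strips the klog header so identical messages with different timestamps collapse into one bucket; the journalctl invocation in the comment is an assumption about how the log was captured:

# count_kubelet_messages.py - sketch: tally repeated kubelet messages from stdin,
# e.g.  journalctl -u kubelet -o cat | python3 count_kubelet_messages.py
import re
import sys
from collections import Counter

counts = Counter()
for line in sys.stdin:
    # Drop the klog prefix (e.g. "I1006 14:54:37.635645 4763 setters.go:603] ")
    # so entries differing only in timestamp count as the same message.
    msg = re.sub(r'^[IWEF]\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+\S+\]\s*', '', line.strip())
    counts[msg] += 1

for msg, n in counts.most_common(10):
    print(f"{n:6d}  {msg[:120]}")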
Oct 06 14:54:38 crc kubenswrapper[4763]: I1006 14:54:38.574493 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:38 crc kubenswrapper[4763]: I1006 14:54:38.574493 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:38 crc kubenswrapper[4763]: E1006 14:54:38.574683 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:38 crc kubenswrapper[4763]: E1006 14:54:38.574785 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[log trimmed: the same five node-status entries repeat at roughly 100 ms intervals from 14:54:38.665 through 14:54:39.488]
Oct 06 14:54:39 crc kubenswrapper[4763]: I1006 14:54:39.574476 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:39 crc kubenswrapper[4763]: I1006 14:54:39.574510 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:39 crc kubenswrapper[4763]: E1006 14:54:39.574796 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:39 crc kubenswrapper[4763]: E1006 14:54:39.574957 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
[log trimmed: the same five node-status entries repeat at roughly 100 ms intervals from 14:54:39.591 through 14:54:40.520]
[The preceding five-record status group (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats 38 more times, identical except for timestamps, roughly every 100 ms between 14:54:39.695 and 14:54:43.516; those repeats are omitted. The non-repeating records interleaved in that window follow.]
Oct 06 14:54:40 crc kubenswrapper[4763]: I1006 14:54:40.574464 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:40 crc kubenswrapper[4763]: I1006 14:54:40.574492 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:40 crc kubenswrapper[4763]: E1006 14:54:40.574697 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:40 crc kubenswrapper[4763]: E1006 14:54:40.574874 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:54:41 crc kubenswrapper[4763]: I1006 14:54:41.574933 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:41 crc kubenswrapper[4763]: I1006 14:54:41.574977 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:41 crc kubenswrapper[4763]: E1006 14:54:41.575151 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 14:54:41 crc kubenswrapper[4763]: E1006 14:54:41.575301 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2"
Oct 06 14:54:42 crc kubenswrapper[4763]: I1006 14:54:42.574598 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:42 crc kubenswrapper[4763]: E1006 14:54:42.574721 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:42 crc kubenswrapper[4763]: I1006 14:54:42.574602 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:42 crc kubenswrapper[4763]: E1006 14:54:42.574927 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.575037 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.575555 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:43 crc kubenswrapper[4763]: E1006 14:54:43.575885 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:43 crc kubenswrapper[4763]: E1006 14:54:43.576017 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.576130 4763 scope.go:117] "RemoveContainer" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.593021 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.609990 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.621042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.621087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.621097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.621114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.621125 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:43Z","lastTransitionTime":"2025-10-06T14:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.637308 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.656939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.673901 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.702228 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.718399 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.724987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.725038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.725053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.725073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.725086 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:43Z","lastTransitionTime":"2025-10-06T14:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.735743 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.749551 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.764498 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.781759 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.794366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.808799 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"784e40b2-ca2b-4a8c-87ff-14b657b5a19f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f4c0f9470910dd15e4b6e688f798068848b21eef74b55a3c77eeaa651a9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025
-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.826480 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont
/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.828462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.828508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.828524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.828544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.828559 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:43Z","lastTransitionTime":"2025-10-06T14:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.841297 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.860262 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.870238 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.892672 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:43Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.932158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.932208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:43 crc 
kubenswrapper[4763]: I1006 14:54:43.932219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.932235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:43 crc kubenswrapper[4763]: I1006 14:54:43.932247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:43Z","lastTransitionTime":"2025-10-06T14:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.034995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.035048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.035064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.035088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.035105 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.052133 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/2.log" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.055457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.056074 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.075945 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.100212 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.118309 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.131170 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.137597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.137652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.137664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.137683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.137698 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.147673 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.159064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.172064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.183462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"784e40b2-ca2b-4a8c-87ff-14b657b5a19f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f4c0f9470910dd15e4b6e688f798068848b21eef74b55a3c77eeaa651a9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.198973 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.216734 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.232021 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.240211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.240237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.240245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.240258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.240266 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.266682 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1dd1f081506b320485cee379cf23ff3d58f5546
e29280125a31d0b1a3b05824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.277324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.288186 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.300337 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.311035 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.321219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.335899 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:44Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.342682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.342754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.342772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.342795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.342812 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.446801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.446862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.446879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.446905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.446926 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.550121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.550192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.550255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.550278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.550296 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.574853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.574853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:44 crc kubenswrapper[4763]: E1006 14:54:44.575090 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:54:44 crc kubenswrapper[4763]: E1006 14:54:44.575164 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.653363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.653481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.653499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.653522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.653539 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.756821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.756878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.756897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.756920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.756937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.860345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.860400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.860418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.860441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.860457 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.963782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.963831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.963849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.963871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:44 crc kubenswrapper[4763]: I1006 14:54:44.963890 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:44Z","lastTransitionTime":"2025-10-06T14:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.061499 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/3.log"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.062571 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/2.log"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066149 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066888 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" exitCode=1
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"}
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.066984 4763 scope.go:117] "RemoveContainer" containerID="f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.067896 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"
Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.068130 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"
Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.094783 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.117840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f978e05f3336904939f1cbb4bdd9f5b62bc7a023c76dd6175fb89e9294f632e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:18Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657116 6508 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 14:54:18.657526 6508 obj_retry.go:551] Creating *factory.egressNode crc took: 2.946146ms\\\\nI1006 14:54:18.657564 6508 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 14:54:18.657637 6508 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 14:54:18.657987 6508 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 14:54:18.658168 6508 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 14:54:18.658216 6508 ovnkube.go:599] Stopped ovnkube\\\\nI1006 14:54:18.658246 6508 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 14:54:18.658335 6508 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:44Z\\\",\\\"message\\\":\\\"fecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 14:54:44.583941 6879 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nF1006 14:54:44.584085 6879 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.131606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.148264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 
14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.169328 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.169972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.170021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.170038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.170062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.170230 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.190012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.210991 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.228391 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.247929 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.268211 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.273181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.273226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.273243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.273265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.273282 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.288444 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.311038 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.334360 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.352066 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.376250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.376301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.376317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.376341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.376359 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.378229 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6
036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.396237 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"784e40b2-ca2b-4a8c-87ff-14b657b5a19f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f4c0f9470910dd15e4b6e688f798068848b21eef74b55a3c77eeaa651a9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.430071 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-
cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.462825 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.478524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.478571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.478586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.478607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.478643 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.495352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.495392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.495404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.495422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.495433 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.509536 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.513661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.513704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.513719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.513740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.513755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.528499 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.532435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.532465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.532476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.532492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.532503 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.546784 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the first status patch above elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z"
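Every status-patch failure in this stretch has the same root cause: the serving certificate of the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-06. Below is a minimal sketch of how one might confirm the expiry from the node; it assumes Python 3 plus the third-party cryptography package, neither of which appears in the log itself.

    import datetime
    import socket
    import ssl

    from cryptography import x509  # third-party package; an assumption, not shown in the log

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the Post URL in the log

    # Verification must be off: the TLS handshake is exactly what fails above,
    # but we still want to read the peer certificate.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # leaf certificate, DER-encoded

    cert = x509.load_der_x509_certificate(der)
    now = datetime.datetime.now(datetime.timezone.utc)
    print("subject:  ", cert.subject.rfc4514_string())
    print("notAfter: ", cert.not_valid_after_utc)        # needs cryptography >= 42
    print("expired:  ", now > cert.not_valid_after_utc)  # True for the state logged here

The probe only restates what the x509 error already says, but it is a quick way to check whether a certificate rotation has actually taken effect after remediation.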
event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.549975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.549988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.549996 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.560656 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.564555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.564600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.564645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.564670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.564684 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.574537 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.574817 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.574884 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.575081 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.580557 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a45ad59-aebd-449e-8dda-9594cfe75912\\\",\\\"systemUUID\\\":\\\"5648b82a-0ebd-488c-add6-0c62e287c376\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:45Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:45 crc kubenswrapper[4763]: E1006 14:54:45.580854 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.582962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
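With the retry budget exhausted, the node also stays NotReady for the reason the Ready condition spells out: kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/, which is expected here because the ovnkube-controller container that would write it is crash-looping (see the CrashLoopBackOff record below). A rough sketch of the check the runtime is effectively performing, assuming the directory from the log and the file extensions conventionally scanned for CNI configs:

    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the NetworkReady error
    CNI_EXTS = (".conf", ".conflist", ".json")  # extensions the CNI library scans for

    try:
        entries = sorted(os.listdir(CNI_CONF_DIR))
    except FileNotFoundError:
        entries = []

    configs = [name for name in entries if name.endswith(CNI_EXTS)]
    if configs:
        print("CNI configuration present:", configs)
    else:
        # Mirrors the logged condition: NetworkReady=false, NetworkPluginNotReady
        print(f"no CNI configuration file in {CNI_CONF_DIR} -- node will stay NotReady")

Once ovnkube-controller stays up long enough to drop its config file there, the NetworkReady condition should flip and the NodeNotReady records should stop.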
event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.583063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.583081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.583106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.583127 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.685565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.685645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.685658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.685678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.685690 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.788366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.788429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.788451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.788483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.788507 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.891869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.891940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.891961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.891989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.892011 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.994765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.994814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.994836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.994861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:45 crc kubenswrapper[4763]: I1006 14:54:45.994882 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:45Z","lastTransitionTime":"2025-10-06T14:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.073230 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/3.log" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.078687 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 14:54:46 crc kubenswrapper[4763]: E1006 14:54:46.079055 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.097932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.097991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.098005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.098024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.098037 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.102124 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1160ce9a-9fc0-4bf6-a092-c0b66d708d97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc4bce725bf9f4460d787de1bcea4a60ca9cdff7a82af8ce8d409d24c10bffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e4b6f9c86aa8074e47b45c3d7825ac865dd7164f6b9df39693af786ded573b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd83030d1eaa2bcfeb742df6197afdb81e1109ee3eaf01ea86eba1a03c4118c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e5cea0562d8d3031f86afcc4a3bec3aee2ffe9d9fe30f2811aaeaca5b7f5639\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f79992b5d228e5ece96200c1a968e8fc82b4e95f1bd3a47c3bcd955afedbca8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1006 14:53:37.116449 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 14:53:37.119341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2754867239/tls.crt::/tmp/serving-cert-2754867239/tls.key\\\\\\\"\\\\nI1006 14:53:43.889838 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 14:53:43.901238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 14:53:43.901280 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 14:53:43.901314 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 14:53:43.901327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 14:53:43.928305 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1006 14:53:43.928384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 14:53:43.928422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 14:53:43.928544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 14:53:43.928566 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 14:53:43.928587 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 14:53:43.928608 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 14:53:43.931371 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da263f0a77f9c1f65bfbfccd88c6d857e609b535219080ec9897c5f9840cce8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5f71b43cae9e6fc0bf5fa1e1cad3df68c7b6bb3437b0df7490cf3d73500ca7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.116568 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f1fc9-39e2-4f90-a977-a6d12117a134\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eed0e448e3daa125cea168152cee9e5ffb3a53ed2abf0372d5a021e048978b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ca6ffc441c35a3216c8bfe6004fdd702209a3fe97a05c6eb4d32f9d279f573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be6d55916911c73990b3ac4372f4a255e2f7d0e6e6f638a08409e2852e0539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7984709f071bda458431e17667f50ce7d55d57fcf1091c2085e4027ca6d090e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.129848 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8338ceb8acc62ca0e735f6095e1147d69fbf24c366a3f7a40e78aae4cbaf211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 
2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.144794 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gqz6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ac212b7-747d-41ba-9f91-e79c223fb17f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfb0a7ebc87672b3c699647ac91c6a81c83ad89d8d5714a5c446926a4c5f3414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csgrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gqz6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.166107 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f86111c-14e3-4725-b3cf-b62a3b711813\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7d87d768687237bc31b2881e15fab52813346726f97798bf2851f8eab10e46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a68630d051ed7f7834b3a75f03b3fba4a67ae67c243431d0880c904f345bdc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb99ce6036f431d4b25852b532ddc2a102a344f37f02adc45769181b8aa6025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0b5faddcd2ae6e2909b1beb803976bbab1af25f932e3f145e0f117d4180acaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d30a10d643102c3beaa020bad22c8b39e013d1519f2c07da69e141233131ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://820a36baf27f8216ec58295931acc5e85290982066551a989a6c849249805d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b7a1ff6ef046f1fc507b91bb59f02b099a1a9eef002a6a2b4b7260a6d0bf9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzrc6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.177280 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"784e40b2-ca2b-4a8c-87ff-14b657b5a19f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f4c0f9470910dd15e4b6e688f798068848b21eef74b55a3c77eeaa651a9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2c13b15d452e564c88a9934cc87bbd0ff363c03df8e12bfb48b3761a89e8bfd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.194667 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.199822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.199868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.199879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.199898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.199911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.209373 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.224170 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bj6z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f7ff70-c0ad-406d-aa9d-6824cb935c66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:32Z\\\",\\\"message\\\":\\\"2025-10-06T14:53:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb\\\\n2025-10-06T14:53:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_faf8d620-4174-4e9c-887e-dab05b769dbb to /host/opt/cni/bin/\\\\n2025-10-06T14:53:47Z [verbose] multus-daemon started\\\\n2025-10-06T14:53:47Z [verbose] Readiness Indicator file check\\\\n2025-10-06T14:54:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b58zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bj6z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.255435 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T14:54:44Z\\\",\\\"message\\\":\\\"fecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 14:54:44.583941 6879 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nF1006 14:54:44.584085 6879 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T14:54:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljzf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jnftg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.269877 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j9x9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a002372a-0206-4c41-9e46-0491543d1d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e10a934957db813adac53514a1d5f635e49af9a6ccac50fe7842c306f7e0960c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqgr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j9x9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.284188 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a93bc27-2eb6-438b-bc9c-cab665f898f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ac1cbe5570fba0f3b5098a76684a74b34eee8b2dc21cc600f6dc314894cb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46698d8eda5b79a9a6a47604ec24c10db160f278d3e676ad70e9a09c8b392a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j4tbd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tzrds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.302012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.302062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.302075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.302093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.302105 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.304518 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8f3640-c814-42bd-b0a6-d0e96895fbcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab01258aa46ef703dd4efaa533bd96a0aeade4d26c009f10217061e75e774baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b518441f09e0a8e3ecaa13f0b970d0776708c53a3822d52911518636b69db76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f717f931bb6404d4874248a755a92e28a8157b70ba23b05062ef81a0e99d3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0443b0bd6ce34c77801b43d33fefff4ca78487d8c4f5e46eaa2d20a52169646d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.322456 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6aeb0e7-db42-449d-8052-fc68154e93d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9g54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgd8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.341605 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.359555 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045a995704c586526ed5065f2292a2e145297af8f46540ba237940c6d68f7734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c12e5d447f540a6baa33e1dabcca7032aa2f62a9622036998488798c74bfb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.376844 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c91c0c6-f031-4840-bc66-ab38e8fb67c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2134c1f91a50b80f8efae9f559c43ae012a4d20f34f8273474b6d9036ce9ee24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdw7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T14:53:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9g2sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.395954 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T14:53:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9502bd300ee0616368b4f83b6697fb76dca61d67346bc5b56d6428667ac003a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T14:54:46Z is after 2025-08-24T17:21:41Z" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.404298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.404350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.404361 4763 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.404381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.404394 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.507112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.507190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.507212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.507280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.507306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.574196 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.574300 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:46 crc kubenswrapper[4763]: E1006 14:54:46.574352 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:46 crc kubenswrapper[4763]: E1006 14:54:46.574506 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.610715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.610788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.610811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.610845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.610867 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.713716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.713765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.713784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.713808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.713825 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.816968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.817021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.817038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.817061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.817077 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.919813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.919875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.919894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.919919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:46 crc kubenswrapper[4763]: I1006 14:54:46.919937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:46Z","lastTransitionTime":"2025-10-06T14:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.023082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.023142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.023158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.023183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.023203 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.126082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.126153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.126185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.126214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.126237 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.229744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.229787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.229798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.229814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.229825 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.332767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.333023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.333142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.333269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.333490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.436939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.437198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.437285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.437363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.437434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.541149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.541186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.541197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.541211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.541220 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.575064 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:47 crc kubenswrapper[4763]: E1006 14:54:47.575241 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.575284 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:47 crc kubenswrapper[4763]: E1006 14:54:47.575472 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.644070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.644111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.644121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.644137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.644148 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.746318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.746361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.746372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.746389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.746401 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.849572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.849634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.849646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.849665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.849675 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.952147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.952211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.952233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.952263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:47 crc kubenswrapper[4763]: I1006 14:54:47.952285 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:47Z","lastTransitionTime":"2025-10-06T14:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.055524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.055592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.055638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.055665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.055796 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.158202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.158244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.158256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.158272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.158284 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.260779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.260843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.260882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.260915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.260937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.364452 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.364529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.364547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.364568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.364584 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.417697 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.417815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.417874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.417941 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.417906204 +0000 UTC m=+149.573198726 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.417978 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.418045 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418143 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.418132761 +0000 UTC m=+149.573425273 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418240 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418287 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.418278266 +0000 UTC m=+149.573570778 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418324 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418346 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418363 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.418432 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.4184106 +0000 UTC m=+149.573703142 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.467492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.467914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.468048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.468239 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.468412 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.519694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.519900 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.519924 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.519941 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.519997 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.519979917 +0000 UTC m=+149.675272429 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.572019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.572324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.572461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.572553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.572689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.574425 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.574445 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.574592 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:48 crc kubenswrapper[4763]: E1006 14:54:48.574777 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.675146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.675221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.675240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.675269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.675287 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.777999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.778303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.778404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.778487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.778563 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.882053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.882367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.882555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.882750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.882882 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.985769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.985842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.985859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.985884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:48 crc kubenswrapper[4763]: I1006 14:54:48.985901 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:48Z","lastTransitionTime":"2025-10-06T14:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.088683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.088760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.088781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.088807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.088826 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.192101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.192178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.192198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.192225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.192245 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.295704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.295799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.295817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.295841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.295859 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.398355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.398430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.398448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.398472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.398493 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.501706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.501830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.501848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.501871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.501887 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.574340 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.574553 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:49 crc kubenswrapper[4763]: E1006 14:54:49.574795 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:49 crc kubenswrapper[4763]: E1006 14:54:49.574967 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.605652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.606249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.606281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.606310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.606332 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.710004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.710058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.710076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.710099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.710115 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.813361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.813435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.813459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.813486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.813506 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.916341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.916401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.916418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.916441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:49 crc kubenswrapper[4763]: I1006 14:54:49.916460 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:49Z","lastTransitionTime":"2025-10-06T14:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.020674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.020773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.020793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.020890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.020969 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.124243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.124323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.124340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.124365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.124387 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.227685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.227790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.227812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.227835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.227856 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.331133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.331214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.331237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.331266 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.331289 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.435776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.435833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.435849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.435872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.435894 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.539352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.539424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.539444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.539467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.539484 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.574460 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.574515 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:50 crc kubenswrapper[4763]: E1006 14:54:50.574698 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:50 crc kubenswrapper[4763]: E1006 14:54:50.574845 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.642809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.642886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.642910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.642941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.642964 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.745787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.745883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.745900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.745961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.745980 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.854161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.854242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.854255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.854274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.854291 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.958878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.958942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.958983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.959007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:50 crc kubenswrapper[4763]: I1006 14:54:50.959024 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:50Z","lastTransitionTime":"2025-10-06T14:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.061652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.061733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.061751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.061776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.061794 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.165887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.165961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.165984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.166011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.166032 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.268950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.269016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.269039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.269069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.269090 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.371819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.371863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.371875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.371894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.371910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.474618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.474701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.474719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.474743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.474760 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.574305 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.574404 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:51 crc kubenswrapper[4763]: E1006 14:54:51.574562 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:51 crc kubenswrapper[4763]: E1006 14:54:51.575127 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.577855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.577911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.577930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.577955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.577973 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.680872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.681001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.681029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.681059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.681082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.783942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.784009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.784036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.784065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.784087 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.887248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.887313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.887346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.887410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.887436 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.990587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.990687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.990703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.990724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:51 crc kubenswrapper[4763]: I1006 14:54:51.990740 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:51Z","lastTransitionTime":"2025-10-06T14:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.092977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.093006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.093014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.093029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.093040 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.195003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.195040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.195052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.195066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.195076 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.298149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.298223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.298246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.298276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.298299 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.400880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.400942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.400983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.401015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.401038 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.504248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.504325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.504347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.504374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.504395 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.574652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:52 crc kubenswrapper[4763]: E1006 14:54:52.574808 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.574658 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:52 crc kubenswrapper[4763]: E1006 14:54:52.575045 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.607187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.607223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.607233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.607248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.607259 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.709701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.710064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.710159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.710230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.710249 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.812560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.812606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.812634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.812649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.812662 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.915290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.915365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.915388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.915424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:52 crc kubenswrapper[4763]: I1006 14:54:52.915446 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:52Z","lastTransitionTime":"2025-10-06T14:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.019058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.019127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.019146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.019171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.019187 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.121830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.121900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.121927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.121954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.121975 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.224662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.224767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.224790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.224818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.224843 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.327594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.327711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.327735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.327768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.327792 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.435549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.435685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.435714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.435743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.435765 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.539342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.539464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.539486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.539516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.539536 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.574671 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.574751 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:54:53 crc kubenswrapper[4763]: E1006 14:54:53.574835 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
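Each "Node became not ready" record embeds the node's Ready condition as JSON. A stdlib-only sketch that decodes exactly that payload (field names taken from the records above; the struct is a hypothetical mirror, not the kubelet's own type, and the message is abridged):

    // condition.go - decode the condition={...} payload printed by setters.go:603.
    package main

    import (
        "encoding/json"
        "fmt"
        "time"
    )

    // NodeCondition mirrors the fields visible in the log records above.
    type NodeCondition struct {
        Type               string    `json:"type"`
        Status             string    `json:"status"`
        LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
        LastTransitionTime time.Time `json:"lastTransitionTime"`
        Reason             string    `json:"reason"`
        Message            string    `json:"message"`
    }

    func main() {
        // abridged copy of a payload from the log
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready"}`
        var c NodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            panic(err)
        }
        fmt.Printf("%s=%s since %s (%s)\n", c.Type, c.Status, c.LastTransitionTime.Format(time.RFC3339), c.Reason)
    }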
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:53 crc kubenswrapper[4763]: E1006 14:54:53.575022 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.642672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.642749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.642773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.642804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.642828 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.693684 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podStartSLOduration=68.693652605 podStartE2EDuration="1m8.693652605s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.691134969 +0000 UTC m=+90.846427511" watchObservedRunningTime="2025-10-06 14:54:53.693652605 +0000 UTC m=+90.848945147" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.707038 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gqz6f" podStartSLOduration=69.707009861 podStartE2EDuration="1m9.707009861s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.706440724 +0000 UTC m=+90.861733266" watchObservedRunningTime="2025-10-06 14:54:53.707009861 +0000 UTC m=+90.862302413" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.731194 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tzrc6" podStartSLOduration=68.731160496 podStartE2EDuration="1m8.731160496s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.729205916 +0000 UTC m=+90.884498468" watchObservedRunningTime="2025-10-06 14:54:53.731160496 +0000 UTC m=+90.886453068" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 
14:54:53.745343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.745410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.745430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.745457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.745477 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.766692 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.766672195 podStartE2EDuration="1m9.766672195s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.7658627 +0000 UTC m=+90.921155232" watchObservedRunningTime="2025-10-06 14:54:53.766672195 +0000 UTC m=+90.921964717" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.767337 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.767327405 podStartE2EDuration="16.767327405s" podCreationTimestamp="2025-10-06 14:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.743581543 +0000 UTC m=+90.898874125" watchObservedRunningTime="2025-10-06 14:54:53.767327405 +0000 UTC m=+90.922619927" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.783220 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.783195867 podStartE2EDuration="37.783195867s" podCreationTimestamp="2025-10-06 14:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.783181987 +0000 UTC m=+90.938474529" watchObservedRunningTime="2025-10-06 14:54:53.783195867 +0000 UTC m=+90.938488419" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.847644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.847692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.847705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.847761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.847778 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.860372 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j9x9s" podStartSLOduration=69.860348423 podStartE2EDuration="1m9.860348423s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.846710928 +0000 UTC m=+91.002003480" watchObservedRunningTime="2025-10-06 14:54:53.860348423 +0000 UTC m=+91.015640975" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.876062 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tzrds" podStartSLOduration=68.87604224 podStartE2EDuration="1m8.87604224s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.860094285 +0000 UTC m=+91.015386797" watchObservedRunningTime="2025-10-06 14:54:53.87604224 +0000 UTC m=+91.031334792" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.876568 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.876561136 podStartE2EDuration="1m7.876561136s" podCreationTimestamp="2025-10-06 14:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.875361069 +0000 UTC m=+91.030653581" watchObservedRunningTime="2025-10-06 14:54:53.876561136 +0000 UTC m=+91.031853688" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.925765 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bj6z5" podStartSLOduration=68.925745721 podStartE2EDuration="1m8.925745721s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:53.92472709 +0000 UTC m=+91.080019652" watchObservedRunningTime="2025-10-06 14:54:53.925745721 +0000 UTC m=+91.081038243" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.950814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.950855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.950869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.950904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:53 crc 
Oct 06 14:54:53 crc kubenswrapper[4763]: I1006 14:54:53.950916 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:53Z","lastTransitionTime":"2025-10-06T14:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.053218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.053256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.053268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.053283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.053295 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.156422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.156481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.156531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.157212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.157337 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.260770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.260844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.260862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.260890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.260910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.364342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.364398 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.364414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.364435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.364469 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.467320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.467357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.467365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.467380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.467389 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.569519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.569566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.569576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.569594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.569606 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.574783 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.574791 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:54 crc kubenswrapper[4763]: E1006 14:54:54.574920 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 14:54:54 crc kubenswrapper[4763]: E1006 14:54:54.575021 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
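These per-pod sync errors recur about once a second because each failed sync requeues the pod with an increasing delay; the "back-off 40s restarting failed container" CrashLoopBackOff record further down in this log is the same idea applied to container restarts (the kubelet's conventional schedule starts at 10s, doubles per failure, and caps at 5m). A minimal per-key backoff sketch with that assumed tuning, not the kubelet's actual implementation:

    // backoff.go - per-key retry delays that double up to a cap, the
    // general shape behind pod-worker requeues and CrashLoopBackOff.
    package main

    import (
        "fmt"
        "time"
    )

    type backoff struct {
        base, max time.Duration
        fails     map[string]uint
    }

    func (b *backoff) next(key string) time.Duration {
        d := b.base << b.fails[key] // double on every failure
        if d > b.max {
            d = b.max
        }
        b.fails[key]++
        return d
    }

    func main() {
        b := &backoff{base: 10 * time.Second, max: 5 * time.Minute, fails: map[string]uint{}}
        for i := 0; i < 6; i++ {
            // prints 10s 20s 40s 1m20s 2m40s 5m0s; the log's "back-off 40s"
            // corresponds to the third consecutive failure
            fmt.Println(b.next("ovnkube-node-jnftg/ovnkube-controller"))
        }
    }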
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.672334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.672413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.672435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.672465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.672490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.776067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.776121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.776139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.776162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.776179 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.879228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.879284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.879303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.879326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.879342 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.982828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.982888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.982904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.982955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:54 crc kubenswrapper[4763]: I1006 14:54:54.982973 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:54Z","lastTransitionTime":"2025-10-06T14:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.086971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.087035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.087052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.087074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.087092 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.190116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.190192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.190214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.190244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.190266 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.294254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.294342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.294368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.294397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.294421 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.396925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.396982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.396999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.397021 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.397040 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.499706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.499763 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.499784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.499814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.499836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.574931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.574987 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:55 crc kubenswrapper[4763]: E1006 14:54:55.575161 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:55 crc kubenswrapper[4763]: E1006 14:54:55.575324 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.601999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.602061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.602084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.602113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.602137 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.610257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.610289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.610297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.610308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.610334 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T14:54:55Z","lastTransitionTime":"2025-10-06T14:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.681063 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"] Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.681374 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.684077 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.684147 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.684109 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.687517 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.807361 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.807676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25345f25-70d4-416c-a8dd-84013d0823d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.807804 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.807948 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25345f25-70d4-416c-a8dd-84013d0823d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.808002 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25345f25-70d4-416c-a8dd-84013d0823d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.909198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25345f25-70d4-416c-a8dd-84013d0823d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.909404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.909440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25345f25-70d4-416c-a8dd-84013d0823d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.909502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.909543 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25345f25-70d4-416c-a8dd-84013d0823d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.911034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.911131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25345f25-70d4-416c-a8dd-84013d0823d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.911288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25345f25-70d4-416c-a8dd-84013d0823d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.919840 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25345f25-70d4-416c-a8dd-84013d0823d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:55 crc kubenswrapper[4763]: I1006 14:54:55.942149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25345f25-70d4-416c-a8dd-84013d0823d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nq9c8\" (UID: \"25345f25-70d4-416c-a8dd-84013d0823d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
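The reconciler_common/operation_generator records above show the two-phase volume flow that gates the cluster-version-operator sandbox: each volume is first verified as attached, then mounted, and only then can the sandbox start. An illustrative sketch of that progression over the five volumes named in the log (toy state machine, not the kubelet reconciler):

    // volumes.go - the verify -> mount progression visible above.
    package main

    import "fmt"

    type volume struct {
        name, kind string // kind as logged: host-path, secret, configmap, projected
    }

    func main() {
        vols := []volume{
            {"etc-ssl-certs", "host-path"},
            {"serving-cert", "secret"},
            {"etc-cvo-updatepayloads", "host-path"},
            {"service-ca", "configmap"},
            {"kube-api-access", "projected"},
        }
        for _, v := range vols {
            fmt.Printf("VerifyControllerAttachedVolume started for %q (%s)\n", v.name, v.kind)
        }
        for _, v := range vols {
            fmt.Printf("MountVolume.SetUp succeeded for %q\n", v.name)
        }
        fmt.Println("all volumes ready; sandbox can be created")
    }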
Oct 06 14:54:56 crc kubenswrapper[4763]: I1006 14:54:56.002244 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8"
Oct 06 14:54:56 crc kubenswrapper[4763]: W1006 14:54:56.024673 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25345f25_70d4_416c_a8dd_84013d0823d5.slice/crio-1e55f86d249621ec4d3f7d6d11c773a0d5fa4eeee4c3a48900b1ea73933a9fca WatchSource:0}: Error finding container 1e55f86d249621ec4d3f7d6d11c773a0d5fa4eeee4c3a48900b1ea73933a9fca: Status 404 returned error can't find the container with id 1e55f86d249621ec4d3f7d6d11c773a0d5fa4eeee4c3a48900b1ea73933a9fca
Oct 06 14:54:56 crc kubenswrapper[4763]: I1006 14:54:56.115590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8" event={"ID":"25345f25-70d4-416c-a8dd-84013d0823d5","Type":"ContainerStarted","Data":"1e55f86d249621ec4d3f7d6d11c773a0d5fa4eeee4c3a48900b1ea73933a9fca"}
Oct 06 14:54:56 crc kubenswrapper[4763]: I1006 14:54:56.574258 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:54:56 crc kubenswrapper[4763]: I1006 14:54:56.574316 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:54:56 crc kubenswrapper[4763]: E1006 14:54:56.574717 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 14:54:56 crc kubenswrapper[4763]: E1006 14:54:56.574923 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
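The W "Failed to process watch event ... Status 404" record is a benign race: the cgroup watch fired before the new crio-1e55f... container was registered with cAdvisor. The PLEG relist right after it still surfaces ContainerStarted by comparing the container sets seen between polls; a simplified sketch of that diffing idea (the real PLEG compares full pod records):

    // pleg.go - generate ContainerStarted-style events by diffing the
    // container sets seen in consecutive relists (simplified PLEG).
    package main

    import "fmt"

    func diff(prev, cur map[string]bool) {
        for id := range cur {
            if !prev[id] {
                fmt.Printf("SyncLoop (PLEG): ContainerStarted %s\n", id)
            }
        }
    }

    func main() {
        prev := map[string]bool{}
        cur := map[string]bool{"1e55f86d2496": true} // sandbox id from the log, shortened
        diff(prev, cur)
    }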
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:57 crc kubenswrapper[4763]: I1006 14:54:57.123467 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8" event={"ID":"25345f25-70d4-416c-a8dd-84013d0823d5","Type":"ContainerStarted","Data":"ad599a536ff5863bd81eab4bc5e85c161ff6725c6358b0cd39617989417173f3"} Oct 06 14:54:57 crc kubenswrapper[4763]: I1006 14:54:57.144130 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nq9c8" podStartSLOduration=73.144106768 podStartE2EDuration="1m13.144106768s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:54:57.142610752 +0000 UTC m=+94.297903304" watchObservedRunningTime="2025-10-06 14:54:57.144106768 +0000 UTC m=+94.299399320" Oct 06 14:54:57 crc kubenswrapper[4763]: I1006 14:54:57.574180 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:57 crc kubenswrapper[4763]: I1006 14:54:57.574181 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:57 crc kubenswrapper[4763]: E1006 14:54:57.574382 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:57 crc kubenswrapper[4763]: E1006 14:54:57.574542 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:54:58 crc kubenswrapper[4763]: I1006 14:54:58.574525 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:54:58 crc kubenswrapper[4763]: I1006 14:54:58.574534 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:54:58 crc kubenswrapper[4763]: E1006 14:54:58.574741 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:54:58 crc kubenswrapper[4763]: E1006 14:54:58.574845 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:54:59 crc kubenswrapper[4763]: I1006 14:54:59.574142 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:54:59 crc kubenswrapper[4763]: I1006 14:54:59.574583 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:54:59 crc kubenswrapper[4763]: E1006 14:54:59.574880 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:54:59 crc kubenswrapper[4763]: E1006 14:54:59.575195 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:00 crc kubenswrapper[4763]: I1006 14:55:00.574862 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:00 crc kubenswrapper[4763]: I1006 14:55:00.574894 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:00 crc kubenswrapper[4763]: E1006 14:55:00.575090 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:00 crc kubenswrapper[4763]: E1006 14:55:00.575220 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:01 crc kubenswrapper[4763]: I1006 14:55:01.574673 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:01 crc kubenswrapper[4763]: E1006 14:55:01.574813 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:01 crc kubenswrapper[4763]: I1006 14:55:01.574835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:01 crc kubenswrapper[4763]: E1006 14:55:01.575186 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:01 crc kubenswrapper[4763]: I1006 14:55:01.575467 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 14:55:01 crc kubenswrapper[4763]: E1006 14:55:01.575641 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:55:02 crc kubenswrapper[4763]: I1006 14:55:02.574080 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:02 crc kubenswrapper[4763]: I1006 14:55:02.574092 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:02 crc kubenswrapper[4763]: E1006 14:55:02.574313 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:02 crc kubenswrapper[4763]: E1006 14:55:02.574412 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:02 crc kubenswrapper[4763]: I1006 14:55:02.591656 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 06 14:55:03 crc kubenswrapper[4763]: I1006 14:55:03.574828 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:03 crc kubenswrapper[4763]: I1006 14:55:03.574857 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:03 crc kubenswrapper[4763]: E1006 14:55:03.576217 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:03 crc kubenswrapper[4763]: E1006 14:55:03.576410 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:03 crc kubenswrapper[4763]: I1006 14:55:03.622556 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.622540259 podStartE2EDuration="1.622540259s" podCreationTimestamp="2025-10-06 14:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:03.621356633 +0000 UTC m=+100.776649155" watchObservedRunningTime="2025-10-06 14:55:03.622540259 +0000 UTC m=+100.777832781" Oct 06 14:55:04 crc kubenswrapper[4763]: I1006 14:55:04.121051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:04 crc kubenswrapper[4763]: E1006 14:55:04.121215 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:55:04 crc kubenswrapper[4763]: E1006 14:55:04.121290 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs podName:d6aeb0e7-db42-449d-8052-fc68154e93d2 nodeName:}" failed. No retries permitted until 2025-10-06 14:56:08.12126968 +0000 UTC m=+165.276562202 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs") pod "network-metrics-daemon-hgd8l" (UID: "d6aeb0e7-db42-449d-8052-fc68154e93d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 14:55:04 crc kubenswrapper[4763]: I1006 14:55:04.574410 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:04 crc kubenswrapper[4763]: I1006 14:55:04.574410 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:04 crc kubenswrapper[4763]: E1006 14:55:04.574670 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:04 crc kubenswrapper[4763]: E1006 14:55:04.574782 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:05 crc kubenswrapper[4763]: I1006 14:55:05.574778 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:05 crc kubenswrapper[4763]: I1006 14:55:05.574788 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:05 crc kubenswrapper[4763]: E1006 14:55:05.574981 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:05 crc kubenswrapper[4763]: E1006 14:55:05.575110 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:06 crc kubenswrapper[4763]: I1006 14:55:06.574579 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:06 crc kubenswrapper[4763]: E1006 14:55:06.574816 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:06 crc kubenswrapper[4763]: I1006 14:55:06.574582 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:06 crc kubenswrapper[4763]: E1006 14:55:06.575200 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:07 crc kubenswrapper[4763]: I1006 14:55:07.574680 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:07 crc kubenswrapper[4763]: I1006 14:55:07.574698 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:07 crc kubenswrapper[4763]: E1006 14:55:07.574882 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:07 crc kubenswrapper[4763]: E1006 14:55:07.575036 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:08 crc kubenswrapper[4763]: I1006 14:55:08.574300 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:08 crc kubenswrapper[4763]: I1006 14:55:08.574328 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:08 crc kubenswrapper[4763]: E1006 14:55:08.574460 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:08 crc kubenswrapper[4763]: E1006 14:55:08.574652 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:09 crc kubenswrapper[4763]: I1006 14:55:09.574887 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:09 crc kubenswrapper[4763]: E1006 14:55:09.575077 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:09 crc kubenswrapper[4763]: I1006 14:55:09.575212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:09 crc kubenswrapper[4763]: E1006 14:55:09.575346 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:10 crc kubenswrapper[4763]: I1006 14:55:10.574303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:10 crc kubenswrapper[4763]: I1006 14:55:10.574305 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:10 crc kubenswrapper[4763]: E1006 14:55:10.574489 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:10 crc kubenswrapper[4763]: E1006 14:55:10.574749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:11 crc kubenswrapper[4763]: I1006 14:55:11.574143 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:11 crc kubenswrapper[4763]: I1006 14:55:11.574212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:11 crc kubenswrapper[4763]: E1006 14:55:11.575021 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:11 crc kubenswrapper[4763]: E1006 14:55:11.575163 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:12 crc kubenswrapper[4763]: I1006 14:55:12.575071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:12 crc kubenswrapper[4763]: I1006 14:55:12.575071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:12 crc kubenswrapper[4763]: E1006 14:55:12.575287 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:12 crc kubenswrapper[4763]: E1006 14:55:12.575414 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:13 crc kubenswrapper[4763]: I1006 14:55:13.574979 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:13 crc kubenswrapper[4763]: E1006 14:55:13.576855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:13 crc kubenswrapper[4763]: I1006 14:55:13.576942 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:13 crc kubenswrapper[4763]: E1006 14:55:13.577072 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:14 crc kubenswrapper[4763]: I1006 14:55:14.574023 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:14 crc kubenswrapper[4763]: I1006 14:55:14.574034 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:14 crc kubenswrapper[4763]: E1006 14:55:14.574350 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:14 crc kubenswrapper[4763]: E1006 14:55:14.574498 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:14 crc kubenswrapper[4763]: I1006 14:55:14.575800 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 14:55:14 crc kubenswrapper[4763]: E1006 14:55:14.576065 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jnftg_openshift-ovn-kubernetes(fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" Oct 06 14:55:15 crc kubenswrapper[4763]: I1006 14:55:15.574303 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:15 crc kubenswrapper[4763]: I1006 14:55:15.574343 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:15 crc kubenswrapper[4763]: E1006 14:55:15.575400 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:15 crc kubenswrapper[4763]: E1006 14:55:15.575437 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:16 crc kubenswrapper[4763]: I1006 14:55:16.574660 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:16 crc kubenswrapper[4763]: I1006 14:55:16.574736 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:16 crc kubenswrapper[4763]: E1006 14:55:16.574916 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:16 crc kubenswrapper[4763]: E1006 14:55:16.575140 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:17 crc kubenswrapper[4763]: I1006 14:55:17.575097 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:17 crc kubenswrapper[4763]: I1006 14:55:17.575153 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:17 crc kubenswrapper[4763]: E1006 14:55:17.575275 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:17 crc kubenswrapper[4763]: E1006 14:55:17.575754 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:18 crc kubenswrapper[4763]: I1006 14:55:18.574828 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:18 crc kubenswrapper[4763]: E1006 14:55:18.575571 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:18 crc kubenswrapper[4763]: I1006 14:55:18.574910 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:18 crc kubenswrapper[4763]: E1006 14:55:18.576035 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.195545 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/1.log" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.196405 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/0.log" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.196492 4763 generic.go:334] "Generic (PLEG): container finished" podID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" containerID="ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936" exitCode=1 Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.196542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerDied","Data":"ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936"} Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.196589 4763 scope.go:117] "RemoveContainer" containerID="6733a2c956244253b2230b5dabbaae7a749dbc84e6fbf211696169527640bd7f" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.197344 4763 scope.go:117] "RemoveContainer" containerID="ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936" Oct 06 14:55:19 crc kubenswrapper[4763]: E1006 14:55:19.197677 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bj6z5_openshift-multus(22f7ff70-c0ad-406d-aa9d-6824cb935c66)\"" pod="openshift-multus/multus-bj6z5" podUID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.574342 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:19 crc kubenswrapper[4763]: I1006 14:55:19.574455 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:19 crc kubenswrapper[4763]: E1006 14:55:19.575267 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:19 crc kubenswrapper[4763]: E1006 14:55:19.575281 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:20 crc kubenswrapper[4763]: I1006 14:55:20.200800 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/1.log" Oct 06 14:55:20 crc kubenswrapper[4763]: I1006 14:55:20.574918 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:20 crc kubenswrapper[4763]: I1006 14:55:20.574970 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:20 crc kubenswrapper[4763]: E1006 14:55:20.575093 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:20 crc kubenswrapper[4763]: E1006 14:55:20.575211 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:21 crc kubenswrapper[4763]: I1006 14:55:21.574748 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:21 crc kubenswrapper[4763]: I1006 14:55:21.574830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:21 crc kubenswrapper[4763]: E1006 14:55:21.574955 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:21 crc kubenswrapper[4763]: E1006 14:55:21.575137 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:22 crc kubenswrapper[4763]: I1006 14:55:22.574793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:22 crc kubenswrapper[4763]: I1006 14:55:22.574809 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:22 crc kubenswrapper[4763]: E1006 14:55:22.575122 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:22 crc kubenswrapper[4763]: E1006 14:55:22.575238 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:23 crc kubenswrapper[4763]: E1006 14:55:23.558706 4763 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 14:55:23 crc kubenswrapper[4763]: I1006 14:55:23.574841 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:23 crc kubenswrapper[4763]: I1006 14:55:23.574952 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:23 crc kubenswrapper[4763]: E1006 14:55:23.575855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:23 crc kubenswrapper[4763]: E1006 14:55:23.575968 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:23 crc kubenswrapper[4763]: E1006 14:55:23.671894 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 14:55:24 crc kubenswrapper[4763]: I1006 14:55:24.574740 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:24 crc kubenswrapper[4763]: I1006 14:55:24.574808 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:24 crc kubenswrapper[4763]: E1006 14:55:24.574890 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:24 crc kubenswrapper[4763]: E1006 14:55:24.574948 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:25 crc kubenswrapper[4763]: I1006 14:55:25.574329 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:25 crc kubenswrapper[4763]: I1006 14:55:25.574380 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:25 crc kubenswrapper[4763]: E1006 14:55:25.575161 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:25 crc kubenswrapper[4763]: E1006 14:55:25.575294 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:26 crc kubenswrapper[4763]: I1006 14:55:26.574888 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:26 crc kubenswrapper[4763]: I1006 14:55:26.574885 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:26 crc kubenswrapper[4763]: E1006 14:55:26.575960 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:26 crc kubenswrapper[4763]: E1006 14:55:26.576045 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:26 crc kubenswrapper[4763]: I1006 14:55:26.577276 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.224267 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/3.log" Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.226669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerStarted","Data":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.227181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.534423 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podStartSLOduration=102.53439202 podStartE2EDuration="1m42.53439202s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:27.266005572 +0000 UTC m=+124.421298084" watchObservedRunningTime="2025-10-06 14:55:27.53439202 +0000 UTC m=+124.689684522" Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.534842 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgd8l"] Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.534989 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:27 crc kubenswrapper[4763]: E1006 14:55:27.535114 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:27 crc kubenswrapper[4763]: I1006 14:55:27.573987 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:27 crc kubenswrapper[4763]: E1006 14:55:27.574138 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:28 crc kubenswrapper[4763]: I1006 14:55:28.574877 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:28 crc kubenswrapper[4763]: I1006 14:55:28.574877 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:28 crc kubenswrapper[4763]: E1006 14:55:28.575369 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:28 crc kubenswrapper[4763]: E1006 14:55:28.575484 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:28 crc kubenswrapper[4763]: E1006 14:55:28.673465 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 14:55:29 crc kubenswrapper[4763]: I1006 14:55:29.574850 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:29 crc kubenswrapper[4763]: E1006 14:55:29.575163 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:29 crc kubenswrapper[4763]: I1006 14:55:29.574865 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:29 crc kubenswrapper[4763]: E1006 14:55:29.575536 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:30 crc kubenswrapper[4763]: I1006 14:55:30.574364 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:30 crc kubenswrapper[4763]: I1006 14:55:30.574371 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:30 crc kubenswrapper[4763]: E1006 14:55:30.575110 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:30 crc kubenswrapper[4763]: E1006 14:55:30.575295 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:31 crc kubenswrapper[4763]: I1006 14:55:31.574320 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:31 crc kubenswrapper[4763]: E1006 14:55:31.574512 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:31 crc kubenswrapper[4763]: I1006 14:55:31.574827 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:31 crc kubenswrapper[4763]: E1006 14:55:31.574979 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:32 crc kubenswrapper[4763]: I1006 14:55:32.574139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:32 crc kubenswrapper[4763]: I1006 14:55:32.574139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:32 crc kubenswrapper[4763]: E1006 14:55:32.574336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:32 crc kubenswrapper[4763]: E1006 14:55:32.574495 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:33 crc kubenswrapper[4763]: I1006 14:55:33.574891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:33 crc kubenswrapper[4763]: I1006 14:55:33.574948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:33 crc kubenswrapper[4763]: E1006 14:55:33.577890 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:33 crc kubenswrapper[4763]: E1006 14:55:33.578078 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:33 crc kubenswrapper[4763]: I1006 14:55:33.578599 4763 scope.go:117] "RemoveContainer" containerID="ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936" Oct 06 14:55:33 crc kubenswrapper[4763]: E1006 14:55:33.674224 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 14:55:34 crc kubenswrapper[4763]: I1006 14:55:34.281331 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/1.log" Oct 06 14:55:34 crc kubenswrapper[4763]: I1006 14:55:34.281827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerStarted","Data":"5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7"} Oct 06 14:55:34 crc kubenswrapper[4763]: I1006 14:55:34.433592 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 14:55:34 crc kubenswrapper[4763]: I1006 14:55:34.574658 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:34 crc kubenswrapper[4763]: I1006 14:55:34.574692 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:34 crc kubenswrapper[4763]: E1006 14:55:34.574777 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:34 crc kubenswrapper[4763]: E1006 14:55:34.574917 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:35 crc kubenswrapper[4763]: I1006 14:55:35.574597 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:35 crc kubenswrapper[4763]: I1006 14:55:35.574793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:35 crc kubenswrapper[4763]: E1006 14:55:35.574969 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:35 crc kubenswrapper[4763]: E1006 14:55:35.575116 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:36 crc kubenswrapper[4763]: I1006 14:55:36.574974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:36 crc kubenswrapper[4763]: I1006 14:55:36.575047 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:36 crc kubenswrapper[4763]: E1006 14:55:36.575113 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:36 crc kubenswrapper[4763]: E1006 14:55:36.575253 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:37 crc kubenswrapper[4763]: I1006 14:55:37.574074 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:55:37 crc kubenswrapper[4763]: I1006 14:55:37.574350 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:37 crc kubenswrapper[4763]: E1006 14:55:37.574499 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgd8l" podUID="d6aeb0e7-db42-449d-8052-fc68154e93d2" Oct 06 14:55:37 crc kubenswrapper[4763]: E1006 14:55:37.574608 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 14:55:38 crc kubenswrapper[4763]: I1006 14:55:38.573911 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:38 crc kubenswrapper[4763]: E1006 14:55:38.574046 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 14:55:38 crc kubenswrapper[4763]: I1006 14:55:38.573911 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:38 crc kubenswrapper[4763]: E1006 14:55:38.574130 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.585029 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.585029 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.585900 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l"
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.588829 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.588995 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.589434 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 06 14:55:39 crc kubenswrapper[4763]: I1006 14:55:39.589831 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 06 14:55:40 crc kubenswrapper[4763]: I1006 14:55:40.574455 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 14:55:40 crc kubenswrapper[4763]: I1006 14:55:40.574465 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 14:55:40 crc kubenswrapper[4763]: I1006 14:55:40.576979 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 06 14:55:40 crc kubenswrapper[4763]: I1006 14:55:40.577286 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.708877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.765267 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.766024 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.768123 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.768807 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.769262 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.770157 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.770573 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.798971 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.799173 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.800114 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.802156 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.802808 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.803947 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.804086 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.804442 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.804816 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.806132 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.806213 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.806779 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.816474 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.818719 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.819058 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xlb8f"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.819415 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.819934 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.823253 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7n989"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.823864 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-phqz6"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.824314 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.824774 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.824879 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7n989"
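NodeReady makes the node schedulable again, so the scheduler's backlog of control-plane pods lands in one burst: every "SyncLoop ADD" here is a pod the kubelet is seeing for the first time. Before any of their containers can start, the volume manager walks each pod's volumes and records what must be attached and then mounted; that is what the long run of reconciler_common.go:245 "VerifyControllerAttachedVolume" entries that follows is. The underlying pattern is a desired-state versus actual-state reconcile loop, roughly like this sketch (illustrative types, not the kubelet's real ones):

```go
// Illustrative sketch only: the shape of the kubelet volume manager's
// reconcile pass (desired state vs. actual state), not its real API.
package main

import "fmt"

type volumeKey struct{ pod, volume string }

func reconcile(desired, actual map[volumeKey]bool) {
	for k := range desired {
		if !actual[k] {
			// Corresponds to "VerifyControllerAttachedVolume started" /
			// "MountVolume started" in the log: bring actual up to desired.
			fmt.Printf("mounting %s for pod %s\n", k.volume, k.pod)
			actual[k] = true
		}
	}
	for k := range actual {
		if !desired[k] {
			fmt.Printf("unmounting %s for pod %s\n", k.volume, k.pod)
			delete(actual, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	desired := map[volumeKey]bool{{"apiserver-76f77b778f-xlb8f", "serving-cert"}: true}
	actual := map[volumeKey]bool{}
	reconcile(desired, actual)
}
```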
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.832071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfc8a48-e756-488e-a468-3f328ca87a48-serving-cert\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-serving-cert\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-auth-proxy-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-image-import-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869718 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-node-pullsecrets\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") "
pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753eb322-6173-4cf4-bed7-28c5ed218b3d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869790 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-service-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67w6\" (UniqueName: \"kubernetes.io/projected/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-kube-api-access-p67w6\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-config\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vsl\" (UniqueName: \"kubernetes.io/projected/3bfc8a48-e756-488e-a468-3f328ca87a48-kube-api-access-95vsl\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869880 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-machine-approver-tls\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869918 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: 
\"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb54v\" (UniqueName: \"kubernetes.io/projected/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-kube-api-access-qb54v\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869959 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753eb322-6173-4cf4-bed7-28c5ed218b3d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.869998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx586\" (UniqueName: \"kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870017 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8a48-e756-488e-a468-3f328ca87a48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870039 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-audit\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-encryption-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-images\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzl4\" (UniqueName: \"kubernetes.io/projected/753eb322-6173-4cf4-bed7-28c5ed218b3d-kube-api-access-tzzl4\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870209 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-etcd-client\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-config\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870256 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9gj\" (UniqueName: \"kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlxm\" (UniqueName: \"kubernetes.io/projected/8bb70296-8159-45af-9860-f681e6e8af61-kube-api-access-2wlxm\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870305 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870324 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-etcd-serving-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-audit-dir\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51d8c3b2-9459-472b-b8cb-54421193044a-serving-cert\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870395 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhwt\" (UniqueName: \"kubernetes.io/projected/51d8c3b2-9459-472b-b8cb-54421193044a-kube-api-access-slhwt\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870414 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.870437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzv8k\" (UniqueName: \"kubernetes.io/projected/9ab5712d-bdff-43e3-b9ab-26d452c17259-kube-api-access-dzv8k\") pod \"downloads-7954f5f757-7n989\" (UID: \"9ab5712d-bdff-43e3-b9ab-26d452c17259\") " pod="openshift-console/downloads-7954f5f757-7n989" Oct 06 14:55:46 crc 
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.877384 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.877948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.886989 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.887159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.887551 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.888036 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.888339 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.888561 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.888684 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.888788 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889018 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889199 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889300 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889394 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889729 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889807 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889913 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.889983 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.890160 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.890232 4763
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.890298 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.890401 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.890475 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.894425 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.894434 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.896672 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.897030 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.897299 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.898117 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.901083 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.901359 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.901470 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.901694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.902923 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.903794 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904296 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904414 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904625 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904857 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.904929 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905028 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905157 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905290 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905290 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905407 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905481 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905519 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.905676 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.906638 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.907261 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.909240 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.917034 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.920764 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.921503 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.921654 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.921707 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.921819 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.922073 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.922865 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.922966 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923068 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923126 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923196 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923290 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w72px"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923327 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923499 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923506 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923711 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923805 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923921 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923966 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.923991 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.924057 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.924116 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.925630 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.925848 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.926274 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.926530 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvml8"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.926825 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.928505 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.928579 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.928842 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.928850 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.929065 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.929217 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.929347 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.928515 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.930436 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.930868 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.930750 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.931663 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.936876 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.939900 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.958252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.958898 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.959020 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.959041 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.959570 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.959904 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.960048 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.960377 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.961778 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.962113 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.962780 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.962972 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.963341 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns"]
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.963686 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.964125 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.964224 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.964838 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.965514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.968416 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.970533 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.971054 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.971446 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
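From here the operation executor moves from verifying attachment (reconciler_common.go:245) to actually mounting (reconciler_common.go:218), and the two phases interleave per volume. The kube-api-access-* volumes being mounted are the projected service-account token volumes injected into every pod: a bound token plus the cluster CA bundle in a single mount. Their approximate shape, sketched with the upstream API types (field values illustrative, not read from this cluster):

```go
// Sketch of the projected volume behind the "kube-api-access-*" names:
// a service-account token plus CA bundle projected into one volume.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607)
	vol := corev1.Volume{
		Name: "kube-api-access-example", // real names carry a random suffix
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```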
\"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-etcd-client\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972067 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlxm\" (UniqueName: \"kubernetes.io/projected/8bb70296-8159-45af-9860-f681e6e8af61-kube-api-access-2wlxm\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-trusted-ca\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-config\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972124 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9gj\" (UniqueName: \"kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-etcd-serving-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-audit-dir\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972224 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51d8c3b2-9459-472b-b8cb-54421193044a-serving-cert\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhwt\" (UniqueName: \"kubernetes.io/projected/51d8c3b2-9459-472b-b8cb-54421193044a-kube-api-access-slhwt\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972259 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972277 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dff0313-c10e-40ca-b794-ab8e85c08c8b-serving-cert\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972294 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebc102c-269a-456a-aacb-fc15bfac28f1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzv8k\" (UniqueName: \"kubernetes.io/projected/9ab5712d-bdff-43e3-b9ab-26d452c17259-kube-api-access-dzv8k\") pod \"downloads-7954f5f757-7n989\" (UID: \"9ab5712d-bdff-43e3-b9ab-26d452c17259\") " pod="openshift-console/downloads-7954f5f757-7n989" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972334 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfc8a48-e756-488e-a468-3f328ca87a48-serving-cert\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972434 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-serving-cert\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-auth-proxy-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972463 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ebc102c-269a-456a-aacb-fc15bfac28f1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972477 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-image-import-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpntk\" (UniqueName: \"kubernetes.io/projected/6ebc102c-269a-456a-aacb-fc15bfac28f1-kube-api-access-kpntk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-node-pullsecrets\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-config\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/753eb322-6173-4cf4-bed7-28c5ed218b3d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p67w6\" (UniqueName: \"kubernetes.io/projected/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-kube-api-access-p67w6\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-service-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972626 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-config\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vsl\" (UniqueName: \"kubernetes.io/projected/3bfc8a48-e756-488e-a468-3f328ca87a48-kube-api-access-95vsl\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972693 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-machine-approver-tls\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972766 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqvz\" (UniqueName: \"kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753eb322-6173-4cf4-bed7-28c5ed218b3d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb54v\" (UniqueName: \"kubernetes.io/projected/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-kube-api-access-qb54v\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972811 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972840 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx586\" (UniqueName: \"kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972855 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8a48-e756-488e-a468-3f328ca87a48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-audit\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-encryption-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972899 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972914 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2kl\" (UniqueName: \"kubernetes.io/projected/1dff0313-c10e-40ca-b794-ab8e85c08c8b-kube-api-access-dh2kl\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-images\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972962 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.972992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.973915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-image-import-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.977024 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.977344 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vqpmv"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.977603 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.977884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.977969 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.978107 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.978443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-config\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.978876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-auth-proxy-config\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.979316 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.979740 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.980066 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-audit\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.980103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.980192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-node-pullsecrets\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.983754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfc8a48-e756-488e-a468-3f328ca87a48-serving-cert\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.984224 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.984559 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753eb322-6173-4cf4-bed7-28c5ed218b3d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.984866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-images\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.984931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.985284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb70296-8159-45af-9860-f681e6e8af61-etcd-serving-ca\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.985662 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-etcd-client\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.985700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.985975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-serving-cert\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.986486 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.987464 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-82t6f"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.988067 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.988424 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.989449 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.989488 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.989628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb70296-8159-45af-9860-f681e6e8af61-audit-dir\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.989776 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.990648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.990874 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.992958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d8c3b2-9459-472b-b8cb-54421193044a-service-ca-bundle\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.991090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.990942 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.991910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.992443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-config\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.991199 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.994068 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.994118 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51d8c3b2-9459-472b-b8cb-54421193044a-serving-cert\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.995544 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.996028 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.996181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.997587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-machine-approver-tls\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.997727 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753eb322-6173-4cf4-bed7-28c5ed218b3d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.997974 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt"] Oct 06 14:55:46 crc kubenswrapper[4763]: I1006 14:55:46.999664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.000047 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.000142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/8bb70296-8159-45af-9860-f681e6e8af61-encryption-config\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.000798 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.003357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8a48-e756-488e-a468-3f328ca87a48-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.012600 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.015485 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.018175 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.018278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.020849 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.023712 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.024496 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.024590 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.026229 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.026828 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rtxf5"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.027451 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.027800 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.028282 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mbhss"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.028525 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.029656 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7n989"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.029708 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.030931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.032083 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.032664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.032816 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.033397 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.034954 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.036081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xlb8f"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.037854 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hksqs"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.040937 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w72px"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.040964 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.040977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.041025 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.041712 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.043148 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvml8"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.045588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-82t6f"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.046626 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.048004 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.049380 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.051195 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.052409 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.054529 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.056036 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-phqz6"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.057243 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.058541 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p8lxt"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.059147 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.059989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.061396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.062733 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.063892 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.065308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.067532 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.068420 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hksqs"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.069656 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vqpmv"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.070720 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.071823 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.073210 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"] Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.073636 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.073667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-trusted-ca\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.073765 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 
14:55:47.073905 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.074038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dff0313-c10e-40ca-b794-ab8e85c08c8b-serving-cert\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.074092 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebc102c-269a-456a-aacb-fc15bfac28f1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.075262 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-trusted-ca\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.075331 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.076675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.077096 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.077107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.077791 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.078076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dff0313-c10e-40ca-b794-ab8e85c08c8b-serving-cert\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.077119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ebc102c-269a-456a-aacb-fc15bfac28f1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.074551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.078844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.078866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.078881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.079322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ebc102c-269a-456a-aacb-fc15bfac28f1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.079556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8lxt"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.079722 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpntk\" (UniqueName: \"kubernetes.io/projected/6ebc102c-269a-456a-aacb-fc15bfac28f1-kube-api-access-kpntk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.079774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-config\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dff0313-c10e-40ca-b794-ab8e85c08c8b-config\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080659 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080751 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.080815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqvz\" (UniqueName: \"kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081601 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2kl\" (UniqueName: \"kubernetes.io/projected/1dff0313-c10e-40ca-b794-ab8e85c08c8b-kube-api-access-dh2kl\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081853 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.081320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.082432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.083199 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.084015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ebc102c-269a-456a-aacb-fc15bfac28f1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.084784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.084912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.086871 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.089071 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.089295 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.091447 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9h5w7"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.095172 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qx94g"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.095228 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9h5w7"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.096466 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qx94g"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.097358 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h5w7"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.098378 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qx94g"]
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.109019 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.128298 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.167969 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.188459 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.209427 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.227965 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.248748 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.267952 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.289801 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.309990 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.328994 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.348860 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.369390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.388917 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.408865 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.434851 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.448692 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.468905 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.489438 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.556795 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzl4\" (UniqueName: \"kubernetes.io/projected/753eb322-6173-4cf4-bed7-28c5ed218b3d-kube-api-access-tzzl4\") pod \"openshift-apiserver-operator-796bbdcf4f-csnzk\" (UID: \"753eb322-6173-4cf4-bed7-28c5ed218b3d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.573976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhwt\" (UniqueName: \"kubernetes.io/projected/51d8c3b2-9459-472b-b8cb-54421193044a-kube-api-access-slhwt\") pod \"authentication-operator-69f744f599-phqz6\" (UID: \"51d8c3b2-9459-472b-b8cb-54421193044a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.586520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlxm\" (UniqueName: \"kubernetes.io/projected/8bb70296-8159-45af-9860-f681e6e8af61-kube-api-access-2wlxm\") pod \"apiserver-76f77b778f-xlb8f\" (UID: \"8bb70296-8159-45af-9860-f681e6e8af61\") " pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.608886 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.613652 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzv8k\" (UniqueName: \"kubernetes.io/projected/9ab5712d-bdff-43e3-b9ab-26d452c17259-kube-api-access-dzv8k\") pod \"downloads-7954f5f757-7n989\" (UID: \"9ab5712d-bdff-43e3-b9ab-26d452c17259\") " pod="openshift-console/downloads-7954f5f757-7n989"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.629876 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.649207 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.668676 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.688340 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.729495 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.734426 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9gj\" (UniqueName: \"kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj\") pod \"route-controller-manager-6576b87f9c-dh45h\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.749073 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.768989 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.777222 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.789340 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.799307 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.809370 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.809665 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.827213 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7n989"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.830221 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.872682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb54v\" (UniqueName: \"kubernetes.io/projected/b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb-kube-api-access-qb54v\") pod \"machine-approver-56656f9798-qr4jz\" (UID: \"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.888706 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.889057 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67w6\" (UniqueName: \"kubernetes.io/projected/bd26b8e0-dc7c-4d93-bb0f-3bf9de025430-kube-api-access-p67w6\") pod \"machine-api-operator-5694c8668f-wqpfm\" (UID: \"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.936934 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.936968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vsl\" (UniqueName: \"kubernetes.io/projected/3bfc8a48-e756-488e-a468-3f328ca87a48-kube-api-access-95vsl\") pod \"openshift-config-operator-7777fb866f-w65lk\" (UID: \"3bfc8a48-e756-488e-a468-3f328ca87a48\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.948221 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.974907 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 06 14:55:47 crc kubenswrapper[4763]: I1006 14:55:47.988344 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.006902 4763 request.go:700] Waited for 1.013194107s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0
Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.008356 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.013944 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.045131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx586\" (UniqueName: \"kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586\") pod \"controller-manager-879f6c89f-pr9h9\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.049690 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.050925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.052849 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7n989"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.067546 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.069118 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab5712d_bdff_43e3_b9ab_26d452c17259.slice/crio-84cc474b956b78d1492081fffeb0696cbe8898d855467990c5f32282cb17d5f9 WatchSource:0}: Error finding container 84cc474b956b78d1492081fffeb0696cbe8898d855467990c5f32282cb17d5f9: Status 404 returned error can't find the container with id 84cc474b956b78d1492081fffeb0696cbe8898d855467990c5f32282cb17d5f9 Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.069256 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.085343 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.089390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.110634 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.130406 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.144139 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-phqz6"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.148202 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.160030 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51d8c3b2_9459_472b_b8cb_54421193044a.slice/crio-425032fbf281d45dbc0857a52b73d770300bc7fb837654750e3b67b10629d299 WatchSource:0}: Error finding container 425032fbf281d45dbc0857a52b73d770300bc7fb837654750e3b67b10629d299: Status 404 returned error can't find the container with id 425032fbf281d45dbc0857a52b73d770300bc7fb837654750e3b67b10629d299 Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.167566 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.188243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.208151 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.230783 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.236980 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.248544 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.261862 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6bf75e_d386_4a0b_92ba_fa9b01645e4a.slice/crio-f9c26c69457ba6d8eeb7e4451aa02b12d463180a240f1f10a01175a13ee0e713 WatchSource:0}: Error finding container f9c26c69457ba6d8eeb7e4451aa02b12d463180a240f1f10a01175a13ee0e713: Status 404 returned error can't find the container with id f9c26c69457ba6d8eeb7e4451aa02b12d463180a240f1f10a01175a13ee0e713 Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.265254 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.270088 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.288967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.291177 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.296083 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.297794 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xlb8f"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.306281 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wqpfm"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.308431 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.310770 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb70296_8159_45af_9860_f681e6e8af61.slice/crio-8ce405271af83ab847a5dcc6abf5e838324215da308db62716cbb984635592a1 WatchSource:0}: Error finding container 8ce405271af83ab847a5dcc6abf5e838324215da308db62716cbb984635592a1: Status 404 returned error can't find the container with id 8ce405271af83ab847a5dcc6abf5e838324215da308db62716cbb984635592a1 Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.311037 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753eb322_6173_4cf4_bed7_28c5ed218b3d.slice/crio-70e6fd39540f465d7e42603b0b56d0306ddb04a0873c53cdee09c57bb1b0c354 WatchSource:0}: Error finding container 70e6fd39540f465d7e42603b0b56d0306ddb04a0873c53cdee09c57bb1b0c354: Status 404 returned error can't find the container with id 70e6fd39540f465d7e42603b0b56d0306ddb04a0873c53cdee09c57bb1b0c354 Oct 06 14:55:48 crc kubenswrapper[4763]: W1006 14:55:48.321489 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd26b8e0_dc7c_4d93_bb0f_3bf9de025430.slice/crio-4f96782553c65372412b99fa434137dede061311b5caa0472398028438b8221b WatchSource:0}: Error finding container 4f96782553c65372412b99fa434137dede061311b5caa0472398028438b8221b: Status 404 returned error can't find the container with id 4f96782553c65372412b99fa434137dede061311b5caa0472398028438b8221b Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.327997 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.335276 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" event={"ID":"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb","Type":"ContainerStarted","Data":"eba518061e0522f71687f34471279ae5679a8a4ad94003b4eaeb62e432f71f5f"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.336547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7n989" 
event={"ID":"9ab5712d-bdff-43e3-b9ab-26d452c17259","Type":"ContainerStarted","Data":"740cb947bf60a6e022d52753d48a7e7a656165f01a777dd978128e06afea929a"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.336650 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7n989" event={"ID":"9ab5712d-bdff-43e3-b9ab-26d452c17259","Type":"ContainerStarted","Data":"84cc474b956b78d1492081fffeb0696cbe8898d855467990c5f32282cb17d5f9"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.336965 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7n989" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.339015 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" event={"ID":"51d8c3b2-9459-472b-b8cb-54421193044a","Type":"ContainerStarted","Data":"d7d94220b61190a7b3ff89214e44732e1db92e0fa068808a330c74554dee6550"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.339041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" event={"ID":"51d8c3b2-9459-472b-b8cb-54421193044a","Type":"ContainerStarted","Data":"425032fbf281d45dbc0857a52b73d770300bc7fb837654750e3b67b10629d299"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.340919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" event={"ID":"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a","Type":"ContainerStarted","Data":"f9c26c69457ba6d8eeb7e4451aa02b12d463180a240f1f10a01175a13ee0e713"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.341286 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-7n989 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.341321 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7n989" podUID="9ab5712d-bdff-43e3-b9ab-26d452c17259" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.343571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" event={"ID":"8bb70296-8159-45af-9860-f681e6e8af61","Type":"ContainerStarted","Data":"8ce405271af83ab847a5dcc6abf5e838324215da308db62716cbb984635592a1"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.346843 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" event={"ID":"753eb322-6173-4cf4-bed7-28c5ed218b3d","Type":"ContainerStarted","Data":"70e6fd39540f465d7e42603b0b56d0306ddb04a0873c53cdee09c57bb1b0c354"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.348132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" event={"ID":"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430","Type":"ContainerStarted","Data":"4f96782553c65372412b99fa434137dede061311b5caa0472398028438b8221b"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.348360 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.354489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" event={"ID":"3bfc8a48-e756-488e-a468-3f328ca87a48","Type":"ContainerStarted","Data":"0dd3653bea2f746ab286a873b71b3d44f94f8ba63453ef90b6e3cce06452295e"} Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.368820 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.391825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.408598 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.427883 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.453148 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.468266 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.489134 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.508789 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.529642 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.548925 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.567586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.587899 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.608174 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.628314 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.648553 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.667949 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.688851 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.698009 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"] Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.709873 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.729138 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.749436 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.771307 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.789054 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.808393 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.828430 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.848549 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.869426 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.888962 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.931111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpntk\" (UniqueName: \"kubernetes.io/projected/6ebc102c-269a-456a-aacb-fc15bfac28f1-kube-api-access-kpntk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nrz2n\" (UID: \"6ebc102c-269a-456a-aacb-fc15bfac28f1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.957177 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqvz\" (UniqueName: \"kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz\") pod \"oauth-openshift-558db77b4-9nc4c\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.963938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2kl\" (UniqueName: \"kubernetes.io/projected/1dff0313-c10e-40ca-b794-ab8e85c08c8b-kube-api-access-dh2kl\") pod \"console-operator-58897d9998-w72px\" (UID: \"1dff0313-c10e-40ca-b794-ab8e85c08c8b\") " pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.969504 4763 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 06 14:55:48 crc kubenswrapper[4763]: I1006 14:55:48.988561 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.007043 4763 request.go:700] Waited for 1.911452s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.008944 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.029260 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.050790 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.068875 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.072530 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tb84\" (UniqueName: \"kubernetes.io/projected/7932c1b6-6f13-4b40-a720-2753116df818-kube-api-access-8tb84\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-tls\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ac57b5-83eb-472e-8dba-1806968a91bf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-encryption-config\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107490 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-audit-policies\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-certificates\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107596 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107684 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcp52\" (UniqueName: \"kubernetes.io/projected/0420693a-911e-43d1-830f-cad488328368-kube-api-access-vcp52\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj8q\" (UniqueName: \"kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7932c1b6-6f13-4b40-a720-2753116df818-audit-dir\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-etcd-client\") pod 
\"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.107979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108032 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ac57b5-83eb-472e-8dba-1806968a91bf-config\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108224 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ac57b5-83eb-472e-8dba-1806968a91bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16763adf-2c76-4b80-833f-39219d9e279e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: 
\"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108414 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-serving-cert\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108480 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-service-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-serving-cert\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108758 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108819 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xcsz\" (UniqueName: \"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-kube-api-access-8xcsz\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16763adf-2c76-4b80-833f-39219d9e279e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.108935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bncjz\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-kube-api-access-bncjz\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6rp\" (UniqueName: \"kubernetes.io/projected/db943839-6479-48dd-baa3-e23662c89494-kube-api-access-xw6rp\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109185 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-etcd-client\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109362 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109417 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db943839-6479-48dd-baa3-e23662c89494-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.109492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-config\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc 
kubenswrapper[4763]: I1006 14:55:49.109545 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-bound-sa-token\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.111172 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:49.611156433 +0000 UTC m=+146.766448955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.138904 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.165381 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-socket-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16763adf-2c76-4b80-833f-39219d9e279e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xcsz\" (UniqueName: 
\"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-kube-api-access-8xcsz\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcz4\" (UniqueName: \"kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bncjz\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-kube-api-access-bncjz\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtv2\" (UniqueName: \"kubernetes.io/projected/1196c738-baee-48da-b414-986e510b81c2-kube-api-access-jxtv2\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98a04d2-8411-4a25-b4ae-030305025e74-config\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210481 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4866322c-ee16-49bd-a037-17189761a083-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40559ba2-0471-4c13-9c3d-4b3bc6d60739-cert\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6rp\" (UniqueName: \"kubernetes.io/projected/db943839-6479-48dd-baa3-e23662c89494-kube-api-access-xw6rp\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-mountpoint-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-profile-collector-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68r67\" (UniqueName: \"kubernetes.io/projected/f551caab-0911-4a99-b78f-b6c9dd198d2e-kube-api-access-68r67\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: \"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81102bbe-a463-45e8-9fb2-dd27f0756db8-metrics-tls\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210627 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210659 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnb4x\" (UniqueName: \"kubernetes.io/projected/13deabc5-432a-4064-9d09-e17b6d11701a-kube-api-access-dnb4x\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86h72\" (UniqueName: \"kubernetes.io/projected/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-kube-api-access-86h72\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-webhook-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0552a-c067-45c4-a3c4-93ab3a2455ec-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db943839-6479-48dd-baa3-e23662c89494-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-bound-sa-token\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210801 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-config\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210816 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-csi-data-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210841 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xgn\" (UniqueName: \"kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210876 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ac57b5-83eb-472e-8dba-1806968a91bf-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tb84\" (UniqueName: \"kubernetes.io/projected/7932c1b6-6f13-4b40-a720-2753116df818-kube-api-access-8tb84\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-srv-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29glw\" (UniqueName: \"kubernetes.io/projected/a068e2ad-5184-4bdd-9585-10a5451a7c3f-kube-api-access-29glw\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210948 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59b2923d-9f63-412c-8f41-5f92f9258163-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.210964 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-images\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211002 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e684b841-47ba-4714-aca1-2fc94976ea11-tmpfs\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcp52\" (UniqueName: \"kubernetes.io/projected/0420693a-911e-43d1-830f-cad488328368-kube-api-access-vcp52\") pod \"etcd-operator-b45778765-qvml8\" (UID: 
\"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13deabc5-432a-4064-9d09-e17b6d11701a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-apiservice-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5jbm\" (UniqueName: \"kubernetes.io/projected/b01c4848-8d37-4d3d-928c-c627cb4f3890-kube-api-access-b5jbm\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj8q\" (UniqueName: \"kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7932c1b6-6f13-4b40-a720-2753116df818-audit-dir\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxglk\" (UniqueName: \"kubernetes.io/projected/e684b841-47ba-4714-aca1-2fc94976ea11-kube-api-access-hxglk\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211175 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-plugins-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 
14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-node-bootstrap-token\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqvfq\" (UniqueName: \"kubernetes.io/projected/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-kube-api-access-nqvfq\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-etcd-client\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00fdffba-67a3-4bfa-a996-c13403e0631a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-8xmdg\" (UniqueName: \"kubernetes.io/projected/f98a04d2-8411-4a25-b4ae-030305025e74-kube-api-access-8xmdg\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ac57b5-83eb-472e-8dba-1806968a91bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-certs\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-metrics-tls\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211453 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211467 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211481 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a068e2ad-5184-4bdd-9585-10a5451a7c3f-metrics-tls\") pod \"dns-default-9h5w7\" (UID: 
\"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/519076e5-c5f8-4122-aa97-7941d40204dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-etcd-client\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxf9\" (UniqueName: \"kubernetes.io/projected/40559ba2-0471-4c13-9c3d-4b3bc6d60739-kube-api-access-8kxf9\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1196c738-baee-48da-b414-986e510b81c2-signing-cabundle\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211657 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6498w\" (UniqueName: \"kubernetes.io/projected/37cd70e6-8b4c-443e-a91d-a76463388fb7-kube-api-access-6498w\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdd5j\" (UniqueName: \"kubernetes.io/projected/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-kube-api-access-rdd5j\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211703 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f551caab-0911-4a99-b78f-b6c9dd198d2e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: 
\"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-tls\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-encryption-config\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24zm\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-kube-api-access-t24zm\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fdffba-67a3-4bfa-a996-c13403e0631a-config\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-audit-policies\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211820 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4866322c-ee16-49bd-a037-17189761a083-proxy-tls\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211835 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxfqd\" (UniqueName: \"kubernetes.io/projected/4866322c-ee16-49bd-a037-17189761a083-kube-api-access-dxfqd\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211850 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-default-certificate\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5df\" (UniqueName: \"kubernetes.io/projected/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-kube-api-access-nz5df\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211891 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-certificates\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b2923d-9f63-412c-8f41-5f92f9258163-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-service-ca-bundle\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ql9w\" (UniqueName: \"kubernetes.io/projected/519076e5-c5f8-4122-aa97-7941d40204dc-kube-api-access-6ql9w\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211971 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-registration-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.211984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-proxy-tls\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212020 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1196c738-baee-48da-b414-986e510b81c2-signing-key\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81102bbe-a463-45e8-9fb2-dd27f0756db8-trusted-ca\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-metrics-certs\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13deabc5-432a-4064-9d09-e17b6d11701a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ac57b5-83eb-472e-8dba-1806968a91bf-config\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc 
kubenswrapper[4763]: I1006 14:55:49.212138 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212152 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b2923d-9f63-412c-8f41-5f92f9258163-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00fdffba-67a3-4bfa-a996-c13403e0631a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212185 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-srv-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16763adf-2c76-4b80-833f-39219d9e279e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212217 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f98a04d2-8411-4a25-b4ae-030305025e74-serving-cert\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4598\" (UniqueName: \"kubernetes.io/projected/f421ec41-8fa4-4d6f-a018-75409ad7dd84-kube-api-access-m4598\") pod \"migrator-59844c95c7-xffns\" (UID: \"f421ec41-8fa4-4d6f-a018-75409ad7dd84\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.212259 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-serving-cert\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213314 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a068e2ad-5184-4bdd-9585-10a5451a7c3f-config-volume\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-service-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-stats-auth\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-serving-cert\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213452 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2gv\" (UniqueName: \"kubernetes.io/projected/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-kube-api-access-fb2gv\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.213478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwxm\" (UniqueName: \"kubernetes.io/projected/16d0552a-c067-45c4-a3c4-93ab3a2455ec-kube-api-access-mcwxm\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.214226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-service-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.215366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-config\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.215981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-etcd-client\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.217141 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db943839-6479-48dd-baa3-e23662c89494-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.218868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16763adf-2c76-4b80-833f-39219d9e279e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.219177 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-tls\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.219360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-encryption-config\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.219484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7932c1b6-6f13-4b40-a720-2753116df818-audit-dir\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.219647 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7932c1b6-6f13-4b40-a720-2753116df818-serving-cert\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.220291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-audit-policies\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.220675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.221192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: 
\"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.221396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.221991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7932c1b6-6f13-4b40-a720-2753116df818-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.222695 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:49.722680588 +0000 UTC m=+146.877973100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.223571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0420693a-911e-43d1-830f-cad488328368-etcd-ca\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.223854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.224066 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ac57b5-83eb-472e-8dba-1806968a91bf-config\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.224374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.225353 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.225912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.226179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-certificates\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.227861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16763adf-2c76-4b80-833f-39219d9e279e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.229552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-serving-cert\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.230427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.232553 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ac57b5-83eb-472e-8dba-1806968a91bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.232872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.234912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0420693a-911e-43d1-830f-cad488328368-etcd-client\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 
14:55:49.235572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.244890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-bound-sa-token\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.267887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6ac57b5-83eb-472e-8dba-1806968a91bf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xmg6k\" (UID: \"b6ac57b5-83eb-472e-8dba-1806968a91bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.285865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tb84\" (UniqueName: \"kubernetes.io/projected/7932c1b6-6f13-4b40-a720-2753116df818-kube-api-access-8tb84\") pod \"apiserver-7bbb656c7d-jmmk4\" (UID: \"7932c1b6-6f13-4b40-a720-2753116df818\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.297663 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"] Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.303738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xcsz\" (UniqueName: \"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-kube-api-access-8xcsz\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314289 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/519076e5-c5f8-4122-aa97-7941d40204dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxf9\" (UniqueName: \"kubernetes.io/projected/40559ba2-0471-4c13-9c3d-4b3bc6d60739-kube-api-access-8kxf9\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314562 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1196c738-baee-48da-b414-986e510b81c2-signing-cabundle\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6498w\" (UniqueName: \"kubernetes.io/projected/37cd70e6-8b4c-443e-a91d-a76463388fb7-kube-api-access-6498w\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdd5j\" (UniqueName: \"kubernetes.io/projected/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-kube-api-access-rdd5j\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314642 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f551caab-0911-4a99-b78f-b6c9dd198d2e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: \"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24zm\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-kube-api-access-t24zm\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fdffba-67a3-4bfa-a996-c13403e0631a-config\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4866322c-ee16-49bd-a037-17189761a083-proxy-tls\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314705 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxfqd\" (UniqueName: \"kubernetes.io/projected/4866322c-ee16-49bd-a037-17189761a083-kube-api-access-dxfqd\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: 
I1006 14:55:49.314719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-default-certificate\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5df\" (UniqueName: \"kubernetes.io/projected/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-kube-api-access-nz5df\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b2923d-9f63-412c-8f41-5f92f9258163-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-service-ca-bundle\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ql9w\" (UniqueName: \"kubernetes.io/projected/519076e5-c5f8-4122-aa97-7941d40204dc-kube-api-access-6ql9w\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314799 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-registration-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-proxy-tls\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1196c738-baee-48da-b414-986e510b81c2-signing-key\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314893 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81102bbe-a463-45e8-9fb2-dd27f0756db8-trusted-ca\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-metrics-certs\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b2923d-9f63-412c-8f41-5f92f9258163-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00fdffba-67a3-4bfa-a996-c13403e0631a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-srv-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314971 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13deabc5-432a-4064-9d09-e17b6d11701a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.314988 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f98a04d2-8411-4a25-b4ae-030305025e74-serving-cert\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: 
\"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4598\" (UniqueName: \"kubernetes.io/projected/f421ec41-8fa4-4d6f-a018-75409ad7dd84-kube-api-access-m4598\") pod \"migrator-59844c95c7-xffns\" (UID: \"f421ec41-8fa4-4d6f-a018-75409ad7dd84\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a068e2ad-5184-4bdd-9585-10a5451a7c3f-config-volume\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-stats-auth\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2gv\" (UniqueName: \"kubernetes.io/projected/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-kube-api-access-fb2gv\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315073 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwxm\" (UniqueName: \"kubernetes.io/projected/16d0552a-c067-45c4-a3c4-93ab3a2455ec-kube-api-access-mcwxm\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315096 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-socket-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315124 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slcz4\" (UniqueName: \"kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315145 4763 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jxtv2\" (UniqueName: \"kubernetes.io/projected/1196c738-baee-48da-b414-986e510b81c2-kube-api-access-jxtv2\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315159 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98a04d2-8411-4a25-b4ae-030305025e74-config\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4866322c-ee16-49bd-a037-17189761a083-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40559ba2-0471-4c13-9c3d-4b3bc6d60739-cert\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315211 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-mountpoint-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315228 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-profile-collector-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68r67\" (UniqueName: \"kubernetes.io/projected/f551caab-0911-4a99-b78f-b6c9dd198d2e-kube-api-access-68r67\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: \"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315259 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81102bbe-a463-45e8-9fb2-dd27f0756db8-metrics-tls\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315270 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1196c738-baee-48da-b414-986e510b81c2-signing-cabundle\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnb4x\" (UniqueName: \"kubernetes.io/projected/13deabc5-432a-4064-9d09-e17b6d11701a-kube-api-access-dnb4x\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86h72\" (UniqueName: \"kubernetes.io/projected/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-kube-api-access-86h72\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-webhook-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315598 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0552a-c067-45c4-a3c4-93ab3a2455ec-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-csi-data-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315663 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xgn\" (UniqueName: \"kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315687 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-srv-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29glw\" (UniqueName: \"kubernetes.io/projected/a068e2ad-5184-4bdd-9585-10a5451a7c3f-kube-api-access-29glw\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 
14:55:49.315718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59b2923d-9f63-412c-8f41-5f92f9258163-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-images\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e684b841-47ba-4714-aca1-2fc94976ea11-tmpfs\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-service-ca-bundle\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13deabc5-432a-4064-9d09-e17b6d11701a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.316774 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00fdffba-67a3-4bfa-a996-c13403e0631a-config\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.320039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-webhook-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.322067 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4866322c-ee16-49bd-a037-17189761a083-proxy-tls\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.322632 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-proxy-tls\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.323145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.324503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/519076e5-c5f8-4122-aa97-7941d40204dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.326321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e684b841-47ba-4714-aca1-2fc94976ea11-tmpfs\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.326392 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-csi-data-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.327923 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81102bbe-a463-45e8-9fb2-dd27f0756db8-trusted-ca\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.329523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.315739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-registration-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.330228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-default-certificate\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 
06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.330306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-socket-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.330335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-apiservice-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.330354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5jbm\" (UniqueName: \"kubernetes.io/projected/b01c4848-8d37-4d3d-928c-c627cb4f3890-kube-api-access-b5jbm\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.330383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxglk\" (UniqueName: \"kubernetes.io/projected/e684b841-47ba-4714-aca1-2fc94976ea11-kube-api-access-hxglk\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.331799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-images\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.331380 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98a04d2-8411-4a25-b4ae-030305025e74-config\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.332152 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:49.832134352 +0000 UTC m=+146.987426864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.333439 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a068e2ad-5184-4bdd-9585-10a5451a7c3f-config-volume\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.334193 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f551caab-0911-4a99-b78f-b6c9dd198d2e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: \"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.334561 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b2923d-9f63-412c-8f41-5f92f9258163-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.334850 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13deabc5-432a-4064-9d09-e17b6d11701a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.335528 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/16d0552a-c067-45c4-a3c4-93ab3a2455ec-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.335743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1196c738-baee-48da-b414-986e510b81c2-signing-key\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336047 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-plugins-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.331368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/16763adf-2c76-4b80-833f-39219d9e279e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5zr7\" (UID: \"16763adf-2c76-4b80-833f-39219d9e279e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-node-bootstrap-token\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqvfq\" (UniqueName: \"kubernetes.io/projected/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-kube-api-access-nqvfq\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00fdffba-67a3-4bfa-a996-c13403e0631a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336598 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336671 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.336857 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-plugins-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.337522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4866322c-ee16-49bd-a037-17189761a083-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.337571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b01c4848-8d37-4d3d-928c-c627cb4f3890-mountpoint-dir\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " 
pod="hostpath-provisioner/csi-hostpathplugin-qx94g" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.337960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13deabc5-432a-4064-9d09-e17b6d11701a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338283 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xmdg\" (UniqueName: \"kubernetes.io/projected/f98a04d2-8411-4a25-b4ae-030305025e74-kube-api-access-8xmdg\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-certs\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-metrics-tls\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.338633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a068e2ad-5184-4bdd-9585-10a5451a7c3f-metrics-tls\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.343770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-stats-auth\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f98a04d2-8411-4a25-b4ae-030305025e74-serving-cert\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-srv-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344850 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81102bbe-a463-45e8-9fb2-dd27f0756db8-metrics-tls\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00fdffba-67a3-4bfa-a996-c13403e0631a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.344995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-srv-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.345346 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b2923d-9f63-412c-8f41-5f92f9258163-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.345943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcp52\" (UniqueName: \"kubernetes.io/projected/0420693a-911e-43d1-830f-cad488328368-kube-api-access-vcp52\") pod \"etcd-operator-b45778765-qvml8\" (UID: \"0420693a-911e-43d1-830f-cad488328368\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.346129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-metrics-certs\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:49 crc kubenswrapper[4763]: 
I1006 14:55:49.346680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-profile-collector-cert\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.351296 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-metrics-tls\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.353244 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.354145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-node-bootstrap-token\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.354442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.354598 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40559ba2-0471-4c13-9c3d-4b3bc6d60739-cert\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.352805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a068e2ad-5184-4bdd-9585-10a5451a7c3f-metrics-tls\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.360869 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37cd70e6-8b4c-443e-a91d-a76463388fb7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.362746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-certs\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " 
pod="openshift-machine-config-operator/machine-config-server-mbhss"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.364774 4763 generic.go:334] "Generic (PLEG): container finished" podID="3bfc8a48-e756-488e-a468-3f328ca87a48" containerID="d8264232f5a3cf65d0c2f79b55923b9f43df0b1b2ec04675bc97c852b5786bb8" exitCode=0
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.364844 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" event={"ID":"3bfc8a48-e756-488e-a468-3f328ca87a48","Type":"ContainerDied","Data":"d8264232f5a3cf65d0c2f79b55923b9f43df0b1b2ec04675bc97c852b5786bb8"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.371275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e684b841-47ba-4714-aca1-2fc94976ea11-apiservice-cert\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.376989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" event={"ID":"753eb322-6173-4cf4-bed7-28c5ed218b3d","Type":"ContainerStarted","Data":"6bf6cbb560cc6869bd7769ebdb25d3c47cea83508dc1c8ea3020f3d27898e6ab"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.379889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" event={"ID":"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d","Type":"ContainerStarted","Data":"fe9f6a743b6c34135d2fed33fbf4adf524527f191522f5bc0a303f22887196ab"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.381589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj8q\" (UniqueName: \"kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q\") pod \"console-f9d7485db-t2f4r\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " pod="openshift-console/console-f9d7485db-t2f4r"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.383938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" event={"ID":"b02ef795-1c7c-49fe-bd12-00043942b97f","Type":"ContainerStarted","Data":"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.383978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" event={"ID":"b02ef795-1c7c-49fe-bd12-00043942b97f","Type":"ContainerStarted","Data":"761eca6bdb67c1d45da3f0e5487c143cebc1505b33f80685355f72494d34f2fc"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.385072 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.390429 4763 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pr9h9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.390473 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.399628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bncjz\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-kube-api-access-bncjz\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.404497 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n"]
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.404720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" event={"ID":"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb","Type":"ContainerStarted","Data":"970b89bc85b68e699cfa7e6c3f067248d3fb2fa427a5045cffa54196068b9222"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.404748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" event={"ID":"b7fa29f6-d0eb-4e4b-a2f7-7ab6d284ecfb","Type":"ContainerStarted","Data":"f8a93b38cb1740c799a244443a0a064832f0a3442b3b4d8effa8839a4d4dcc0d"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.414141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" event={"ID":"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a","Type":"ContainerStarted","Data":"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.418707 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.424573 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t2f4r"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.426055 4763 generic.go:334] "Generic (PLEG): container finished" podID="8bb70296-8159-45af-9860-f681e6e8af61" containerID="ee1d1515e47eb0ac03b765ad2ab67fcbd07587ecaad3cdf1bbe9412c81d8072d" exitCode=0
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.426769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" event={"ID":"8bb70296-8159-45af-9860-f681e6e8af61","Type":"ContainerDied","Data":"ee1d1515e47eb0ac03b765ad2ab67fcbd07587ecaad3cdf1bbe9412c81d8072d"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.428836 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.432545 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-7n989 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.432598 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7n989" podUID="9ab5712d-bdff-43e3-b9ab-26d452c17259" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.432850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" event={"ID":"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430","Type":"ContainerStarted","Data":"1368b43bb21180aeababd069810d427e3ee543d061c46920968e9f52988e43ff"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.432899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" event={"ID":"bd26b8e0-dc7c-4d93-bb0f-3bf9de025430","Type":"ContainerStarted","Data":"6b95ec40e4278510d58e78d9a6aea4102858f2da3e7515647a62a264654ae182"}
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.439125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6rp\" (UniqueName: \"kubernetes.io/projected/db943839-6479-48dd-baa3-e23662c89494-kube-api-access-xw6rp\") pod \"cluster-samples-operator-665b6dd947-6mbnl\" (UID: \"db943839-6479-48dd-baa3-e23662c89494\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.439747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.441327 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:49.941313999 +0000 UTC m=+147.096606511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.441431 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.447761 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.464641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxf9\" (UniqueName: \"kubernetes.io/projected/40559ba2-0471-4c13-9c3d-4b3bc6d60739-kube-api-access-8kxf9\") pod \"ingress-canary-p8lxt\" (UID: \"40559ba2-0471-4c13-9c3d-4b3bc6d60739\") " pod="openshift-ingress-canary/ingress-canary-p8lxt"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.468230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ql9w\" (UniqueName: \"kubernetes.io/projected/519076e5-c5f8-4122-aa97-7941d40204dc-kube-api-access-6ql9w\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8t2s\" (UID: \"519076e5-c5f8-4122-aa97-7941d40204dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.472929 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.485271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnb4x\" (UniqueName: \"kubernetes.io/projected/13deabc5-432a-4064-9d09-e17b6d11701a-kube-api-access-dnb4x\") pod \"kube-storage-version-migrator-operator-b67b599dd-t8gg4\" (UID: \"13deabc5-432a-4064-9d09-e17b6d11701a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.496482 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w72px"]
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.502561 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.505376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86h72\" (UniqueName: \"kubernetes.io/projected/3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3-kube-api-access-86h72\") pod \"machine-config-operator-74547568cd-z6p46\" (UID: \"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.526557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxfqd\" (UniqueName: \"kubernetes.io/projected/4866322c-ee16-49bd-a037-17189761a083-kube-api-access-dxfqd\") pod \"machine-config-controller-84d6567774-zphtt\" (UID: \"4866322c-ee16-49bd-a037-17189761a083\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.528090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.541571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.542566 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.042552194 +0000 UTC m=+147.197844776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.544988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6498w\" (UniqueName: \"kubernetes.io/projected/37cd70e6-8b4c-443e-a91d-a76463388fb7-kube-api-access-6498w\") pod \"olm-operator-6b444d44fb-cfn9t\" (UID: \"37cd70e6-8b4c-443e-a91d-a76463388fb7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.547179 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.562771 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.564046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xgn\" (UniqueName: \"kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn\") pod \"marketplace-operator-79b997595-zmrz8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.586986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwxm\" (UniqueName: \"kubernetes.io/projected/16d0552a-c067-45c4-a3c4-93ab3a2455ec-kube-api-access-mcwxm\") pod \"package-server-manager-789f6589d5-d9hfk\" (UID: \"16d0552a-c067-45c4-a3c4-93ab3a2455ec\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.592705 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.607536 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.609221 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdd5j\" (UniqueName: \"kubernetes.io/projected/8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c-kube-api-access-rdd5j\") pod \"catalog-operator-68c6474976-6b2lk\" (UID: \"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.624493 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.628988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zm\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-kube-api-access-t24zm\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.642535 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.642967 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.142953115 +0000 UTC m=+147.298245617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.654888 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.657544 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcz4\" (UniqueName: \"kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4\") pod \"collect-profiles-29329365-nptf5\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.664774 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.667545 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8lxt"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.677094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtv2\" (UniqueName: \"kubernetes.io/projected/1196c738-baee-48da-b414-986e510b81c2-kube-api-access-jxtv2\") pod \"service-ca-9c57cc56f-vqpmv\" (UID: \"1196c738-baee-48da-b414-986e510b81c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.701883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2gv\" (UniqueName: \"kubernetes.io/projected/3da11f04-62a5-45e7-9c88-4d9a39fff2d3-kube-api-access-fb2gv\") pod \"machine-config-server-mbhss\" (UID: \"3da11f04-62a5-45e7-9c88-4d9a39fff2d3\") " pod="openshift-machine-config-operator/machine-config-server-mbhss"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.702353 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4"]
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.707060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68r67\" (UniqueName: \"kubernetes.io/projected/f551caab-0911-4a99-b78f-b6c9dd198d2e-kube-api-access-68r67\") pod \"multus-admission-controller-857f4d67dd-hksqs\" (UID: \"f551caab-0911-4a99-b78f-b6c9dd198d2e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.722470 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00fdffba-67a3-4bfa-a996-c13403e0631a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8sn9\" (UID: \"00fdffba-67a3-4bfa-a996-c13403e0631a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.744452 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.744794 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.244781278 +0000 UTC m=+147.400073780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.753899 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"]
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.757277 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81102bbe-a463-45e8-9fb2-dd27f0756db8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mljxf\" (UID: \"81102bbe-a463-45e8-9fb2-dd27f0756db8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.768732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4598\" (UniqueName: \"kubernetes.io/projected/f421ec41-8fa4-4d6f-a018-75409ad7dd84-kube-api-access-m4598\") pod \"migrator-59844c95c7-xffns\" (UID: \"f421ec41-8fa4-4d6f-a018-75409ad7dd84\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.802274 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7"]
Oct 06 14:55:49 crc kubenswrapper[4763]: W1006 14:55:49.806589 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc281a39_2f6b_407d_a27e_0d19025186d7.slice/crio-8d69643ee0562150fc8e8ea83829bf2dfeba0445db944f83458f1642412473d1 WatchSource:0}: Error finding container 8d69643ee0562150fc8e8ea83829bf2dfeba0445db944f83458f1642412473d1: Status 404 returned error can't find the container with id 8d69643ee0562150fc8e8ea83829bf2dfeba0445db944f83458f1642412473d1
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.812166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5df\" (UniqueName: \"kubernetes.io/projected/4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13-kube-api-access-nz5df\") pod \"router-default-5444994796-rtxf5\" (UID: \"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13\") " pod="openshift-ingress/router-default-5444994796-rtxf5"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.812352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29glw\" (UniqueName: \"kubernetes.io/projected/a068e2ad-5184-4bdd-9585-10a5451a7c3f-kube-api-access-29glw\") pod \"dns-default-9h5w7\" (UID: \"a068e2ad-5184-4bdd-9585-10a5451a7c3f\") " pod="openshift-dns/dns-default-9h5w7"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.818821 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.833051 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.841074 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.842309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59b2923d-9f63-412c-8f41-5f92f9258163-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6sqtv\" (UID: \"59b2923d-9f63-412c-8f41-5f92f9258163\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.846223 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.846486 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.346460806 +0000 UTC m=+147.501753308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.849760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxglk\" (UniqueName: \"kubernetes.io/projected/e684b841-47ba-4714-aca1-2fc94976ea11-kube-api-access-hxglk\") pod \"packageserver-d55dfcdfc-dq69v\" (UID: \"e684b841-47ba-4714-aca1-2fc94976ea11\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.854895 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.871416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqvfq\" (UniqueName: \"kubernetes.io/projected/d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8-kube-api-access-nqvfq\") pod \"dns-operator-744455d44c-82t6f\" (UID: \"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8\") " pod="openshift-dns-operator/dns-operator-744455d44c-82t6f"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.878931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.887141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.894556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5jbm\" (UniqueName: \"kubernetes.io/projected/b01c4848-8d37-4d3d-928c-c627cb4f3890-kube-api-access-b5jbm\") pod \"csi-hostpathplugin-qx94g\" (UID: \"b01c4848-8d37-4d3d-928c-c627cb4f3890\") " pod="hostpath-provisioner/csi-hostpathplugin-qx94g"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.922036 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rtxf5"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.923463 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4"]
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.929558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xmdg\" (UniqueName: \"kubernetes.io/projected/f98a04d2-8411-4a25-b4ae-030305025e74-kube-api-access-8xmdg\") pod \"service-ca-operator-777779d784-7zwvq\" (UID: \"f98a04d2-8411-4a25-b4ae-030305025e74\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.931327 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mbhss"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.938977 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.947960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:49 crc kubenswrapper[4763]: E1006 14:55:49.948223 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.448212396 +0000 UTC m=+147.603504908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.959903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.996671 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9h5w7"
Oct 06 14:55:49 crc kubenswrapper[4763]: I1006 14:55:49.998648 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qx94g"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.057345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.058087 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.558072502 +0000 UTC m=+147.713365014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.117383 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.167846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.168237 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.668221617 +0000 UTC m=+147.823514129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.175949 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.198860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.199098 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.202269 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k"]
Oct 06 14:55:50 crc kubenswrapper[4763]: W1006 14:55:50.267730 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ac57b5_83eb_472e_8dba_1806968a91bf.slice/crio-b9f73781f6e743ee7e40a5b32a6186125223cd0544a055a9f487d590af994c78 WatchSource:0}: Error finding container b9f73781f6e743ee7e40a5b32a6186125223cd0544a055a9f487d590af994c78: Status 404 returned error can't find the container with id b9f73781f6e743ee7e40a5b32a6186125223cd0544a055a9f487d590af994c78
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.268693 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.269010 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.768996929 +0000 UTC m=+147.924289441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: W1006 14:55:50.349115 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e08bdc2_eb75_4e53_9ec9_2c9ac7982cb3.slice/crio-8e11284399e77648984c463a11b5f91869bcb33d1d835a14c8f66ed7676a9583 WatchSource:0}: Error finding container 8e11284399e77648984c463a11b5f91869bcb33d1d835a14c8f66ed7676a9583: Status 404 returned error can't find the container with id 8e11284399e77648984c463a11b5f91869bcb33d1d835a14c8f66ed7676a9583
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.371337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.371707 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.871691307 +0000 UTC m=+148.026983819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.450841 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvml8"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.473169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.473797 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:50.973767676 +0000 UTC m=+148.129060198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.486735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" event={"ID":"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d","Type":"ContainerStarted","Data":"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.487649 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.489289 4763 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9nc4c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body=
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.489322 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.491171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" event={"ID":"13deabc5-432a-4064-9d09-e17b6d11701a","Type":"ContainerStarted","Data":"93ea61e72ac4af0b647e0e8f04928785dcbba427bb1c6db8ac8d027d8a1113f1"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.502472 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-csnzk" podStartSLOduration=126.502458761 podStartE2EDuration="2m6.502458761s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:50.5013749 +0000 UTC m=+147.656667412" watchObservedRunningTime="2025-10-06 14:55:50.502458761 +0000 UTC m=+147.657751273"
Oct 06 14:55:50 crc kubenswrapper[4763]: W1006 14:55:50.508507 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da11f04_62a5_45e7_9c88_4d9a39fff2d3.slice/crio-f9220d878961db988a9c09a11e7884b0fb26edded655424b3e8d161f0e932e22 WatchSource:0}: Error finding container f9220d878961db988a9c09a11e7884b0fb26edded655424b3e8d161f0e932e22: Status 404 returned error can't find the container with id f9220d878961db988a9c09a11e7884b0fb26edded655424b3e8d161f0e932e22
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.508983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" event={"ID":"7932c1b6-6f13-4b40-a720-2753116df818","Type":"ContainerStarted","Data":"eb62f8efb45fb1e5107812c1027165c00e0da7533c7fb426174d277505ae6433"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.568456 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.577099 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.577594 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.077583197 +0000 UTC m=+148.232875709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.606041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" event={"ID":"6ebc102c-269a-456a-aacb-fc15bfac28f1","Type":"ContainerStarted","Data":"5ee5a65f43a2fa27424dc41246882cbe14b3cce996e69f6fa669e8fbd69c0e22"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.606085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" event={"ID":"6ebc102c-269a-456a-aacb-fc15bfac28f1","Type":"ContainerStarted","Data":"977534179fcfb0694de69276c31db343ff5449f1924002d5f3af8cd95ae63cb9"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.611344 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rtxf5" event={"ID":"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13","Type":"ContainerStarted","Data":"7db06279a6ed1879de34ee67470c3cd07804dedf2cfe095dbcaa17a1afbe8c65"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.615268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" event={"ID":"3bfc8a48-e756-488e-a468-3f328ca87a48","Type":"ContainerStarted","Data":"db4348586414331a9b3a883c0a4acf590fcb3298eadef393ad5a1c17c1ff9705"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.615828 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.620554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" event={"ID":"b6ac57b5-83eb-472e-8dba-1806968a91bf","Type":"ContainerStarted","Data":"b9f73781f6e743ee7e40a5b32a6186125223cd0544a055a9f487d590af994c78"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.650018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2f4r" event={"ID":"fc281a39-2f6b-407d-a27e-0d19025186d7","Type":"ContainerStarted","Data":"8d69643ee0562150fc8e8ea83829bf2dfeba0445db944f83458f1642412473d1"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.667010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" event={"ID":"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3","Type":"ContainerStarted","Data":"8e11284399e77648984c463a11b5f91869bcb33d1d835a14c8f66ed7676a9583"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.691708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.692963 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.192949183 +0000 UTC m=+148.348241685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.696876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" event={"ID":"16763adf-2c76-4b80-833f-39219d9e279e","Type":"ContainerStarted","Data":"9a8c4d068193ad10200e88942af665e45245b0e95bb1222d2e6dd9aea80e545f"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.699882 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w72px" event={"ID":"1dff0313-c10e-40ca-b794-ab8e85c08c8b","Type":"ContainerStarted","Data":"22889abf8e90d3b5962109482b1dced8719af9a258d26a22fb59c00ccea0343d"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.699907 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w72px" event={"ID":"1dff0313-c10e-40ca-b794-ab8e85c08c8b","Type":"ContainerStarted","Data":"b1c8ce1ccd41d97a2890b6cbd51fec32ccc0e4ede5a34e59aaa9063191578207"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.700369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-w72px"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.708954 4763 patch_prober.go:28] interesting pod/console-operator-58897d9998-w72px container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.709002 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w72px" podUID="1dff0313-c10e-40ca-b794-ab8e85c08c8b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.723061 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.744950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" event={"ID":"8bb70296-8159-45af-9860-f681e6e8af61","Type":"ContainerStarted","Data":"71e68b602326b61a574429e70a03b31a5fbf164835960ba367744c6177a54f95"}
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.746337 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-7n989 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.746382 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7n989" podUID="9ab5712d-bdff-43e3-b9ab-26d452c17259" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.754585 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.768795 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.769471 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.789754 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.793060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.798015 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.298001139 +0000 UTC m=+148.453293741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.819092 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-phqz6" podStartSLOduration=126.819077523 podStartE2EDuration="2m6.819077523s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:50.814886201 +0000 UTC m=+147.970178713" watchObservedRunningTime="2025-10-06 14:55:50.819077523 +0000 UTC m=+147.974370035"
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.820111 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.894750 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.895017 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.394996921 +0000 UTC m=+148.550289433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.896659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:50 crc kubenswrapper[4763]: E1006 14:55:50.907769 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.407756383 +0000 UTC m=+148.563048895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.940685 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8lxt"]
Oct 06 14:55:50 crc kubenswrapper[4763]: I1006 14:55:50.941187 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vqpmv"]
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:50.999092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:50.999426 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.499409329 +0000 UTC m=+148.654701841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.100839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.101412 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.601401206 +0000 UTC m=+148.756693718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.147643 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9"]
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.149049 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl"]
Oct 06 14:55:51 crc kubenswrapper[4763]: W1006 14:55:51.153869 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b243bd7_367c_41e7_9101_981ed6d10a13.slice/crio-bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b WatchSource:0}: Error finding container bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b: Status 404 returned error can't find the container with id bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b
Oct 06 14:55:51 crc kubenswrapper[4763]: W1006 14:55:51.189493 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40559ba2_0471_4c13_9c3d_4b3bc6d60739.slice/crio-e7ba8fb25daddeb22bedba4d176077bc2bc306d5f95a28c83858ce5fe74a8d58 WatchSource:0}: Error finding container e7ba8fb25daddeb22bedba4d176077bc2bc306d5f95a28c83858ce5fe74a8d58: Status 404 returned error can't find the container with id e7ba8fb25daddeb22bedba4d176077bc2bc306d5f95a28c83858ce5fe74a8d58
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.203473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.203809 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.703770955 +0000 UTC m=+148.859063467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.244396 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qr4jz" podStartSLOduration=127.244382116 podStartE2EDuration="2m7.244382116s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.244203551 +0000 UTC m=+148.399496063" watchObservedRunningTime="2025-10-06 14:55:51.244382116 +0000 UTC m=+148.399674618"
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.309445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.309947 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.809934383 +0000 UTC m=+148.965226895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.378944 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wqpfm" podStartSLOduration=126.378930991 podStartE2EDuration="2m6.378930991s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.32907606 +0000 UTC m=+148.484368572" watchObservedRunningTime="2025-10-06 14:55:51.378930991 +0000 UTC m=+148.534223503"
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.411069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.411370 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:51.911354824 +0000 UTC m=+149.066647336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.465332 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7n989" podStartSLOduration=126.465309464 podStartE2EDuration="2m6.465309464s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.457060254 +0000 UTC m=+148.612352766" watchObservedRunningTime="2025-10-06 14:55:51.465309464 +0000 UTC m=+148.620601976"
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.474366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"]
Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.513804 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.013791544 +0000 UTC m=+149.169084056 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.513834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.558277 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.571774 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.615725 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.616024 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.116010438 +0000 UTC m=+149.271302950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.630786 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" podStartSLOduration=126.630764807 podStartE2EDuration="2m6.630764807s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.622879208 +0000 UTC m=+148.778171720" watchObservedRunningTime="2025-10-06 14:55:51.630764807 +0000 UTC m=+148.786057319" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.721818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.722127 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.222115755 +0000 UTC m=+149.377408267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.756587 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" podStartSLOduration=126.756570667 podStartE2EDuration="2m6.756570667s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.721000163 +0000 UTC m=+148.876292675" watchObservedRunningTime="2025-10-06 14:55:51.756570667 +0000 UTC m=+148.911863179" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.761227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" event={"ID":"6b243bd7-367c-41e7-9101-981ed6d10a13","Type":"ContainerStarted","Data":"bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.777861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" event={"ID":"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c","Type":"ContainerStarted","Data":"e93c9e33d40307d7e82183fe8c31aac557a2596c27dfb18723e32c8978e78053"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.779247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" event={"ID":"81102bbe-a463-45e8-9fb2-dd27f0756db8","Type":"ContainerStarted","Data":"e2c70cadcfc655dbc29641bd6b80daf8d80c7b7a9e64ef8a7b0ee1518b65c3e7"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.785836 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hksqs"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.791821 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-82t6f"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.797540 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nrz2n" podStartSLOduration=126.797525479 podStartE2EDuration="2m6.797525479s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.794267634 +0000 UTC m=+148.949560136" watchObservedRunningTime="2025-10-06 14:55:51.797525479 +0000 UTC m=+148.952817991" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.798647 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.798678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" 
event={"ID":"4866322c-ee16-49bd-a037-17189761a083","Type":"ContainerStarted","Data":"cdbb37b3067f37e301c55e7017480c2e037c88ab9d0a34e376827b267b40afa6"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.824419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:51 crc kubenswrapper[4763]: W1006 14:55:51.824936 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf551caab_0911_4a99_b78f_b6c9dd198d2e.slice/crio-459da026fbead5a9bfe36fc2d33bfecc8832393f5dd31459a055fc49dfbe4a46 WatchSource:0}: Error finding container 459da026fbead5a9bfe36fc2d33bfecc8832393f5dd31459a055fc49dfbe4a46: Status 404 returned error can't find the container with id 459da026fbead5a9bfe36fc2d33bfecc8832393f5dd31459a055fc49dfbe4a46 Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.825911 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.325883014 +0000 UTC m=+149.481175526 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.829358 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h5w7"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.856203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" event={"ID":"8bb70296-8159-45af-9860-f681e6e8af61","Type":"ContainerStarted","Data":"ae796c1ae4eabcffe3b8c6e761c9b3cc7f8f306ba571cf7ec997ebc2a78db691"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.860395 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" podStartSLOduration=127.860372897 podStartE2EDuration="2m7.860372897s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.84980299 +0000 UTC m=+149.005095512" watchObservedRunningTime="2025-10-06 14:55:51.860372897 +0000 UTC m=+149.015665409" Oct 06 14:55:51 crc kubenswrapper[4763]: W1006 14:55:51.866700 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98a04d2_8411_4a25_b4ae_030305025e74.slice/crio-98f63b1ab65c5d53e73621e3477cb73b448c0decd4e0a31f7fbc0be07d6a06ed WatchSource:0}: Error finding container 98f63b1ab65c5d53e73621e3477cb73b448c0decd4e0a31f7fbc0be07d6a06ed: Status 404 returned error can't find the container with id 98f63b1ab65c5d53e73621e3477cb73b448c0decd4e0a31f7fbc0be07d6a06ed 
Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.871490 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qx94g"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.874746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" event={"ID":"ef7f0e05-29c2-4890-8d13-0466593e1fa8","Type":"ContainerStarted","Data":"3a58529215a0e7e18c4b1c5e457a511d4c153654088068d85d9f104a702dbcb0"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.880303 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.888270 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv"] Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.895820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rtxf5" event={"ID":"4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13","Type":"ContainerStarted","Data":"4b2007f0fdb8828cc149703e147983e5c540686ad7b0e55b59607d68d4b74a8a"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.906042 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" event={"ID":"00fdffba-67a3-4bfa-a996-c13403e0631a","Type":"ContainerStarted","Data":"d74905da6c049f41e8966abf627129a81f3e2bbb3f80536cb0c89375964e827c"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.920375 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-w72px" podStartSLOduration=127.920359973 podStartE2EDuration="2m7.920359973s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.898724533 +0000 UTC m=+149.054017045" watchObservedRunningTime="2025-10-06 14:55:51.920359973 +0000 UTC m=+149.075652485" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.920488 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" podStartSLOduration=126.920483446 podStartE2EDuration="2m6.920483446s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.9192286 +0000 UTC m=+149.074521112" watchObservedRunningTime="2025-10-06 14:55:51.920483446 +0000 UTC m=+149.075775958" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.932226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:51 crc kubenswrapper[4763]: E1006 14:55:51.933812 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 14:55:52.433799014 +0000 UTC m=+149.589091526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.934005 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.936184 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.936487 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.936438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2f4r" event={"ID":"fc281a39-2f6b-407d-a27e-0d19025186d7","Type":"ContainerStarted","Data":"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.983048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" event={"ID":"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3","Type":"ContainerStarted","Data":"a4632ff6d6271be861b900053893967963e4af247575fa642d184ca12cc5598e"} Oct 06 14:55:51 crc kubenswrapper[4763]: I1006 14:55:51.996136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5zr7" event={"ID":"16763adf-2c76-4b80-833f-39219d9e279e","Type":"ContainerStarted","Data":"91eb13f51ea3acc3cf76d59b5edcbcf2e113518469ce0dcb38d4ea6a4ce067cc"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.001013 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" podStartSLOduration=128.000995899 podStartE2EDuration="2m8.000995899s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:51.995965632 +0000 UTC m=+149.151258144" watchObservedRunningTime="2025-10-06 14:55:52.000995899 +0000 UTC m=+149.156288411" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.036779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mbhss" event={"ID":"3da11f04-62a5-45e7-9c88-4d9a39fff2d3","Type":"ContainerStarted","Data":"522749e0279e5c489b878bcdef99306274a3091ec8eea8f783e819b563966ab2"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.036826 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-mbhss" event={"ID":"3da11f04-62a5-45e7-9c88-4d9a39fff2d3","Type":"ContainerStarted","Data":"f9220d878961db988a9c09a11e7884b0fb26edded655424b3e8d161f0e932e22"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.037869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.039342 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.539315963 +0000 UTC m=+149.694608545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.063452 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" event={"ID":"519076e5-c5f8-4122-aa97-7941d40204dc","Type":"ContainerStarted","Data":"16280477e13f00aa8b2f800e98426d5a900a78851379eb4954374a97cccb3a65"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.075139 4763 generic.go:334] "Generic (PLEG): container finished" podID="7932c1b6-6f13-4b40-a720-2753116df818" containerID="6d6aa5f652bc67154e3f093e5aca9d81b54def031106fa07b4b52143dccf6037" exitCode=0 Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.075201 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" event={"ID":"7932c1b6-6f13-4b40-a720-2753116df818","Type":"ContainerDied","Data":"6d6aa5f652bc67154e3f093e5aca9d81b54def031106fa07b4b52143dccf6037"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.083110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p8lxt" event={"ID":"40559ba2-0471-4c13-9c3d-4b3bc6d60739","Type":"ContainerStarted","Data":"e7ba8fb25daddeb22bedba4d176077bc2bc306d5f95a28c83858ce5fe74a8d58"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.089441 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-t2f4r" podStartSLOduration=127.089418871 podStartE2EDuration="2m7.089418871s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.062073946 +0000 UTC m=+149.217366458" watchObservedRunningTime="2025-10-06 14:55:52.089418871 +0000 UTC m=+149.244711383" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.091177 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rtxf5" podStartSLOduration=127.091172832 podStartE2EDuration="2m7.091172832s" 
podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.087939158 +0000 UTC m=+149.243231670" watchObservedRunningTime="2025-10-06 14:55:52.091172832 +0000 UTC m=+149.246465344" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.098315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" event={"ID":"1196c738-baee-48da-b414-986e510b81c2","Type":"ContainerStarted","Data":"d1ac6cee392acdb79102ac69b0c32f33fb00de21a2f8ecbd808ff41d3dae1656"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.119168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" event={"ID":"0420693a-911e-43d1-830f-cad488328368","Type":"ContainerStarted","Data":"b95f50ec51cec9f877e8c8de0c9b5c04ede6fe8a8cbe16ebd999f0a54767df44"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.122471 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" event={"ID":"13deabc5-432a-4064-9d09-e17b6d11701a","Type":"ContainerStarted","Data":"24bc676ab95d6e851d885e76b3463cefc0e50ec7143eab4d0a5418e088fe9685"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.132581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" event={"ID":"37cd70e6-8b4c-443e-a91d-a76463388fb7","Type":"ContainerStarted","Data":"18941719e3e4c1e0638064fb3f42f189c4ef33d931e2d42cdb6c6786baefcee6"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.133550 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.134360 4763 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cfn9t container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.134393 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" podUID="37cd70e6-8b4c-443e-a91d-a76463388fb7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.138677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" event={"ID":"16d0552a-c067-45c4-a3c4-93ab3a2455ec","Type":"ContainerStarted","Data":"3538f2b84462bc756748d3916e2bb1e20d0718d55e6b2bf588f82265778cfb00"} Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.141608 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mbhss" podStartSLOduration=6.141590739 podStartE2EDuration="6.141590739s" podCreationTimestamp="2025-10-06 14:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.112384859 +0000 UTC m=+149.267677381" watchObservedRunningTime="2025-10-06 
14:55:52.141590739 +0000 UTC m=+149.296883251" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.141765 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.144345 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.644334749 +0000 UTC m=+149.799627261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.175007 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w65lk" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.175203 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.177602 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" podStartSLOduration=128.177594346 podStartE2EDuration="2m8.177594346s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.177080031 +0000 UTC m=+149.332372543" watchObservedRunningTime="2025-10-06 14:55:52.177594346 +0000 UTC m=+149.332886858" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.248594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.250508 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.750485137 +0000 UTC m=+149.905777649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.268367 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-t8gg4" podStartSLOduration=127.268350207 podStartE2EDuration="2m7.268350207s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.235650395 +0000 UTC m=+149.390942907" watchObservedRunningTime="2025-10-06 14:55:52.268350207 +0000 UTC m=+149.423642719" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.269297 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" podStartSLOduration=127.269290094 podStartE2EDuration="2m7.269290094s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.204111408 +0000 UTC m=+149.359403920" watchObservedRunningTime="2025-10-06 14:55:52.269290094 +0000 UTC m=+149.424582606" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.284463 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" podStartSLOduration=127.284447615 podStartE2EDuration="2m7.284447615s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.26776953 +0000 UTC m=+149.423062042" watchObservedRunningTime="2025-10-06 14:55:52.284447615 +0000 UTC m=+149.439740127" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.361425 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.361729 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.861717992 +0000 UTC m=+150.017010504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.379910 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" podStartSLOduration=127.379895801 podStartE2EDuration="2m7.379895801s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.340494045 +0000 UTC m=+149.495786557" watchObservedRunningTime="2025-10-06 14:55:52.379895801 +0000 UTC m=+149.535188313" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.417856 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p8lxt" podStartSLOduration=6.417840245 podStartE2EDuration="6.417840245s" podCreationTimestamp="2025-10-06 14:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:52.381637182 +0000 UTC m=+149.536929704" watchObservedRunningTime="2025-10-06 14:55:52.417840245 +0000 UTC m=+149.573132757" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.462166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.462746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.463030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.463131 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.469211 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:52.969190789 +0000 UTC m=+150.124483301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.481487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.493719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.495147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.509857 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.571728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.571786 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.575970 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.075955485 +0000 UTC m=+150.231247997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.582208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.601944 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.612040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.619748 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-w72px" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.673540 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.673923 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.173904185 +0000 UTC m=+150.329196697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.778796 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.779144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.779238 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.779530 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.279518717 +0000 UTC m=+150.434811229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.880199 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.880497 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.380470655 +0000 UTC m=+150.535763167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.881439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.881750 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.381739321 +0000 UTC m=+150.537031833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.929029 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 14:55:52 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Oct 06 14:55:52 crc kubenswrapper[4763]: [+]process-running ok Oct 06 14:55:52 crc kubenswrapper[4763]: healthz check failed Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.929077 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 14:55:52 crc kubenswrapper[4763]: I1006 14:55:52.982285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:52 crc kubenswrapper[4763]: E1006 14:55:52.982681 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.482664928 +0000 UTC m=+150.637957440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.085886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.086499 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.586484268 +0000 UTC m=+150.741776780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.187094 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.187603 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.687585319 +0000 UTC m=+150.842877831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.202635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" event={"ID":"ef7f0e05-29c2-4890-8d13-0466593e1fa8","Type":"ContainerStarted","Data":"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.218376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vqpmv" event={"ID":"1196c738-baee-48da-b414-986e510b81c2","Type":"ContainerStarted","Data":"f2bb21541bc0dc40c620fece8cfd6771e19d256239070f2538d906276b1585ef"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.240819 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" podStartSLOduration=128.240806938 podStartE2EDuration="2m8.240806938s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.238758288 +0000 UTC m=+150.394050800" watchObservedRunningTime="2025-10-06 14:55:53.240806938 +0000 UTC m=+150.396099450" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.260464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h5w7" event={"ID":"a068e2ad-5184-4bdd-9585-10a5451a7c3f","Type":"ContainerStarted","Data":"99591028987c6069d6112c6bde8f2c08555120ed508818e64e9b44c3cd77b0f2"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.287511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" event={"ID":"3e08bdc2-eb75-4e53-9ec9-2c9ac7982cb3","Type":"ContainerStarted","Data":"ffd7327ce6f4eefab4220938c0749355deef0003bef34f80e90539ae86c3ba05"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.288337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.289256 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.789244807 +0000 UTC m=+150.944537319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.297590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" event={"ID":"e684b841-47ba-4714-aca1-2fc94976ea11","Type":"ContainerStarted","Data":"0142210bf0eb5c8deaab2c2a5feb6e4b658ae3b99942f0ecb83a6da64da578f1"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.297861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" event={"ID":"e684b841-47ba-4714-aca1-2fc94976ea11","Type":"ContainerStarted","Data":"b35626443ad0fa311deea6018ae5bf217235ce76dfef883b78f9355ca06de094"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.297877 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.306739 4763 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dq69v container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.306783 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" podUID="e684b841-47ba-4714-aca1-2fc94976ea11" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.311809 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" event={"ID":"db943839-6479-48dd-baa3-e23662c89494","Type":"ContainerStarted","Data":"d72e877b65f4190dbd2320d2e75a0a69a768ba5f9ccf511a9b7903e57738cb77"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.311860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" event={"ID":"db943839-6479-48dd-baa3-e23662c89494","Type":"ContainerStarted","Data":"e581a7965431356ab307c0217c3c48f4cfb0c9e3a996ce998c4a3dddd04ae1d4"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.319128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" event={"ID":"16d0552a-c067-45c4-a3c4-93ab3a2455ec","Type":"ContainerStarted","Data":"fe2d34c265c4d935da0addd82857e47ecc65da3b00fc3b306d232f6572c3d709"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.319178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" event={"ID":"16d0552a-c067-45c4-a3c4-93ab3a2455ec","Type":"ContainerStarted","Data":"b07c64c87b6536dce95c0e6a1f7bfcd75acc5dd3dbac8f9eb918d701006a7fbf"} Oct 06 14:55:53 crc kubenswrapper[4763]: 
I1006 14:55:53.319833 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.369327 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" podStartSLOduration=128.369310796 podStartE2EDuration="2m8.369310796s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.367050151 +0000 UTC m=+150.522342663" watchObservedRunningTime="2025-10-06 14:55:53.369310796 +0000 UTC m=+150.524603308" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.369473 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z6p46" podStartSLOduration=128.369467821 podStartE2EDuration="2m8.369467821s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.335867003 +0000 UTC m=+150.491159525" watchObservedRunningTime="2025-10-06 14:55:53.369467821 +0000 UTC m=+150.524760333" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.374780 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" event={"ID":"f421ec41-8fa4-4d6f-a018-75409ad7dd84","Type":"ContainerStarted","Data":"61aa01e3ecf66a43fd2b687a88e7049cba456522f32f29a93fca4b496455a1f4"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.374807 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" event={"ID":"f421ec41-8fa4-4d6f-a018-75409ad7dd84","Type":"ContainerStarted","Data":"9375b00430822beeed847b52f9285d4e0a51c193ebb860f41a0e2c2a7dfa1293"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.389400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.390360 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.890345989 +0000 UTC m=+151.045638501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.402487 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" event={"ID":"37cd70e6-8b4c-443e-a91d-a76463388fb7","Type":"ContainerStarted","Data":"0f05e513c0a63b3170523f5a987306897415757c7cc89682d3ced4becf31514d"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.406629 4763 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cfn9t container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.406668 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" podUID="37cd70e6-8b4c-443e-a91d-a76463388fb7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.425808 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" podStartSLOduration=129.4257906 podStartE2EDuration="2m9.4257906s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.394974303 +0000 UTC m=+150.550266815" watchObservedRunningTime="2025-10-06 14:55:53.4257906 +0000 UTC m=+150.581083112" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.426934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p8lxt" event={"ID":"40559ba2-0471-4c13-9c3d-4b3bc6d60739","Type":"ContainerStarted","Data":"4c2323eef7a2f439c37ec03027e327c69aeefada4ad8b9566ef63c0f6884f2da"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.468953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" event={"ID":"6b243bd7-367c-41e7-9101-981ed6d10a13","Type":"ContainerStarted","Data":"f8e75e84c9eadf077b260ee87240d74c4238eb9a17823964c806038b34446ae8"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.471925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" event={"ID":"59b2923d-9f63-412c-8f41-5f92f9258163","Type":"ContainerStarted","Data":"2c7ad5afa4d4423e503b015ac56cc3126c717c6ef5d063980da31dc855d05a10"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.480530 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" podStartSLOduration=128.480516212 podStartE2EDuration="2m8.480516212s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.442953889 +0000 UTC m=+150.598246401" watchObservedRunningTime="2025-10-06 14:55:53.480516212 +0000 UTC m=+150.635808724" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.482734 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" podStartSLOduration=128.482728526 podStartE2EDuration="2m8.482728526s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.479994547 +0000 UTC m=+150.635287059" watchObservedRunningTime="2025-10-06 14:55:53.482728526 +0000 UTC m=+150.638021038" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.494277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.495704 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:53.995693623 +0000 UTC m=+151.150986135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.499963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" event={"ID":"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8","Type":"ContainerStarted","Data":"6e800d46bca768c3d42a9676a714e7f97dae9b825eb464c83a307d32e402566a"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.500011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" event={"ID":"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8","Type":"ContainerStarted","Data":"4cdbf29d54ceaf2a597485203dc3a969ffdf4deaba7316c395ef43f6761621ed"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.511364 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" podStartSLOduration=129.511348709 podStartE2EDuration="2m9.511348709s" podCreationTimestamp="2025-10-06 14:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.509918137 +0000 UTC m=+150.665210659" watchObservedRunningTime="2025-10-06 14:55:53.511348709 +0000 UTC m=+150.666641221" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.520331 4763 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xlb8f container/openshift-apiserver namespace/openshift-apiserver: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]log ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]etcd ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/max-in-flight-filter ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 06 14:55:53 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 06 14:55:53 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectcache ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 06 14:55:53 crc kubenswrapper[4763]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 06 14:55:53 crc kubenswrapper[4763]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 14:55:53 crc kubenswrapper[4763]: livez check failed Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.520398 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f" podUID="8bb70296-8159-45af-9860-f681e6e8af61" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.528310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" event={"ID":"b01c4848-8d37-4d3d-928c-c627cb4f3890","Type":"ContainerStarted","Data":"7bdd69ce3835618976f82194e2e5fb2ab35fbe52450d873e9c28663ce33c9734"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.552742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" event={"ID":"8ae83d3d-4f6e-4a2a-80f0-8cc306c57a7c","Type":"ContainerStarted","Data":"283b8eeac038dc0d631b9f99c010293f5ce267b65f3e9cf7b1d42d39396d6043"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.553462 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.599064 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.600108 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.100086521 +0000 UTC m=+151.255379033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.648080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" event={"ID":"81102bbe-a463-45e8-9fb2-dd27f0756db8","Type":"ContainerStarted","Data":"ac9f6e692f0f6b5e9cdb661e90400e075636c6ad2974061d78f87c2217116468"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.648123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" event={"ID":"f551caab-0911-4a99-b78f-b6c9dd198d2e","Type":"ContainerStarted","Data":"459da026fbead5a9bfe36fc2d33bfecc8832393f5dd31459a055fc49dfbe4a46"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.648188 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.661243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qvml8" event={"ID":"0420693a-911e-43d1-830f-cad488328368","Type":"ContainerStarted","Data":"50312fc546496f102b68063c9bed0a527325230e5df8a93b13be3e6047307019"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.679095 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6b2lk" podStartSLOduration=128.679078559 podStartE2EDuration="2m8.679078559s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:53.605715554 +0000 UTC m=+150.761008066" watchObservedRunningTime="2025-10-06 14:55:53.679078559 +0000 UTC m=+150.834371071" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.693976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" event={"ID":"519076e5-c5f8-4122-aa97-7941d40204dc","Type":"ContainerStarted","Data":"2aa6120bfdb76736d77d7fa71acb8135eb16d25690a3243bc1b5733af5da799a"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.706482 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" event={"ID":"b6ac57b5-83eb-472e-8dba-1806968a91bf","Type":"ContainerStarted","Data":"6a0f38cbb37febf0dddc13cb35d75fd0452e72681ec7ccd10b3551b923974a59"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.706540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.710490 4763 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.210477712 +0000 UTC m=+151.365770224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.742943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" event={"ID":"4866322c-ee16-49bd-a037-17189761a083","Type":"ContainerStarted","Data":"16b324a05777c6654a20220344ab2017545c293518c11967a85250ed0fd70ad5"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.743003 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" event={"ID":"4866322c-ee16-49bd-a037-17189761a083","Type":"ContainerStarted","Data":"448ddd9ae3bee8a4230e7fb13fb3c36ee6dc4b475ba276b7e7cd8fdac9dd2dc7"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.768704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" event={"ID":"00fdffba-67a3-4bfa-a996-c13403e0631a","Type":"ContainerStarted","Data":"41ce4fe4388378744a934402c15b25a1fc74c4f4aaa1482e24ba91c2d842fd8a"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.771534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" event={"ID":"f98a04d2-8411-4a25-b4ae-030305025e74","Type":"ContainerStarted","Data":"c1d51fd42880e4a706b68418c2f43d31e9545c3d6a8601c7d510c8bc20130b20"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.771560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" event={"ID":"f98a04d2-8411-4a25-b4ae-030305025e74","Type":"ContainerStarted","Data":"98f63b1ab65c5d53e73621e3477cb73b448c0decd4e0a31f7fbc0be07d6a06ed"} Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.847075 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.847389 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.347370155 +0000 UTC m=+151.502662667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.851441 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.858831 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.358815618 +0000 UTC m=+151.514108130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.933818 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 14:55:53 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Oct 06 14:55:53 crc kubenswrapper[4763]: [+]process-running ok Oct 06 14:55:53 crc kubenswrapper[4763]: healthz check failed Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.933878 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 14:55:53 crc kubenswrapper[4763]: I1006 14:55:53.956363 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:53 crc kubenswrapper[4763]: E1006 14:55:53.957653 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.457637953 +0000 UTC m=+151.612930465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.022775 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8sn9" podStartSLOduration=129.022762078 podStartE2EDuration="2m9.022762078s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.021771899 +0000 UTC m=+151.177064411" watchObservedRunningTime="2025-10-06 14:55:54.022762078 +0000 UTC m=+151.178054590" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.057346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.057715 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.557703404 +0000 UTC m=+151.712995916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.081091 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7zwvq" podStartSLOduration=129.081073344 podStartE2EDuration="2m9.081073344s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.045803728 +0000 UTC m=+151.201096240" watchObservedRunningTime="2025-10-06 14:55:54.081073344 +0000 UTC m=+151.236365856" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.111712 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zphtt" podStartSLOduration=129.111696065 podStartE2EDuration="2m9.111696065s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.109745958 +0000 UTC m=+151.265038470" watchObservedRunningTime="2025-10-06 14:55:54.111696065 +0000 UTC m=+151.266988577" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.143966 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8t2s" podStartSLOduration=129.143949793 podStartE2EDuration="2m9.143949793s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.134281642 +0000 UTC m=+151.289574154" watchObservedRunningTime="2025-10-06 14:55:54.143949793 +0000 UTC m=+151.299242305" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.161155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.161527 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.661512004 +0000 UTC m=+151.816804516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.230913 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xmg6k" podStartSLOduration=129.230898563 podStartE2EDuration="2m9.230898563s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.176528021 +0000 UTC m=+151.331820533" watchObservedRunningTime="2025-10-06 14:55:54.230898563 +0000 UTC m=+151.386191075" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.231278 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" podStartSLOduration=129.231271634 podStartE2EDuration="2m9.231271634s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.225507916 +0000 UTC m=+151.380800428" watchObservedRunningTime="2025-10-06 14:55:54.231271634 +0000 UTC m=+151.386564146" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.262422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.262797 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.76277622 +0000 UTC m=+151.918068792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.364696 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.364891 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.86486252 +0000 UTC m=+152.020155032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.365541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.365862 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.865853559 +0000 UTC m=+152.021146071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.467152 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.467555 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:54.967534468 +0000 UTC m=+152.122826980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.568566 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.568943 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.068932288 +0000 UTC m=+152.224224800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.669929 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.670239 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.170224855 +0000 UTC m=+152.325517367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.771060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.771416 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.271403398 +0000 UTC m=+152.426695910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.777061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h5w7" event={"ID":"a068e2ad-5184-4bdd-9585-10a5451a7c3f","Type":"ContainerStarted","Data":"23d6775b5dffa4e2abf76ab3a7e1aab42605bae7bcc147b104eeffe1bf0fc62c"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.777116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h5w7" event={"ID":"a068e2ad-5184-4bdd-9585-10a5451a7c3f","Type":"ContainerStarted","Data":"9e147697d1ea47f7397cc00d3f1d151d48de9c56b9c7c583eb6f4c0b2903a952"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.777168 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9h5w7" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.779131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" event={"ID":"f551caab-0911-4a99-b78f-b6c9dd198d2e","Type":"ContainerStarted","Data":"9fa0edef7ff3ebb548d72fb8a43eb0ab3ace847c9ff1b7432ca56482b3994c61"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.779174 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" event={"ID":"f551caab-0911-4a99-b78f-b6c9dd198d2e","Type":"ContainerStarted","Data":"b166c6f240341991bd7a449c0fa28002f8adc671e76c0d3e23d9cfab1c467f6f"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.781018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mljxf" event={"ID":"81102bbe-a463-45e8-9fb2-dd27f0756db8","Type":"ContainerStarted","Data":"cdea0113eb03ce7ba5c6a00d497c4cee951f7f0a727a79d0362b43afac0b5037"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.785123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" event={"ID":"7932c1b6-6f13-4b40-a720-2753116df818","Type":"ContainerStarted","Data":"fe54835d93d34ac5b88dcc0be4dce7d504eb8733b0f870952055213225f1a35f"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.787831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" event={"ID":"59b2923d-9f63-412c-8f41-5f92f9258163","Type":"ContainerStarted","Data":"7ae15c0516626a12640efa83625d6064835ba45559a2a806698750064b20c35c"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.791284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1711ed2f6f7eed4daa280158faf613363de98f3b8f97fa8d65f82ca267f99dac"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.791336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be9c8386bc9ad572cc338507ce8262e4cc35fa27c60ca25f904f9a4b8d9b65b0"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.794331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" event={"ID":"b01c4848-8d37-4d3d-928c-c627cb4f3890","Type":"ContainerStarted","Data":"83b8fedfde3a6f93de00b68e6e37305552942681c115b73d3975f69ae66838f5"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.796068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"112100db298de0c412b9ff3a84c3756272751d71812fbb7b999a50f317ba92c3"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.796101 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c4b83433e7940e621253607301c3b526092336efb9da6b878a94e2479bda6360"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.796672 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.799534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"931f2812bdf5d7950b557961a01b8c4141010b717761196d4c85a25f7b8e9391"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.799574 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d7cd395b511ebf747a9cd7afaaf1df56b07435bf9999008a4fb15fb43be3296a"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.802516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xffns" event={"ID":"f421ec41-8fa4-4d6f-a018-75409ad7dd84","Type":"ContainerStarted","Data":"255cd16ba5e9ff1ac953b7f980ed7b5f4c3e208d4fb56006730db88b0cf23e76"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.806573 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" event={"ID":"d100b2f0-e66f-4f24-92b6-4fc3ce5d4cd8","Type":"ContainerStarted","Data":"4fc4534222d7920d2625e781d831da57e48a7cc181a6ac74d3c573fbb53ea96d"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.807139 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9h5w7" podStartSLOduration=8.807124247 podStartE2EDuration="8.807124247s" podCreationTimestamp="2025-10-06 14:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.805262313 +0000 UTC m=+151.960554825" watchObservedRunningTime="2025-10-06 14:55:54.807124247 +0000 UTC m=+151.962416759" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.809697 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6mbnl" 
event={"ID":"db943839-6479-48dd-baa3-e23662c89494","Type":"ContainerStarted","Data":"0e2d06e111f02c065e86dd0ea5b868bf816173bd2e74a5a6a62fee401476a513"} Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.812339 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.814278 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zmrz8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.814328 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.820559 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dq69v" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.851336 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cfn9t" Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.872274 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.872469 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.372422497 +0000 UTC m=+152.527715009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.873400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.878805 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 14:55:55.378663129 +0000 UTC m=+152.533955641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.897177 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6sqtv" podStartSLOduration=129.897160097 podStartE2EDuration="2m9.897160097s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.858593455 +0000 UTC m=+152.013885967" watchObservedRunningTime="2025-10-06 14:55:54.897160097 +0000 UTC m=+152.052452609"
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.928989 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 14:55:54 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Oct 06 14:55:54 crc kubenswrapper[4763]: [+]process-running ok
Oct 06 14:55:54 crc kubenswrapper[4763]: healthz check failed
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.929069 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.931788 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" podStartSLOduration=129.931770534 podStartE2EDuration="2m9.931770534s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.93060325 +0000 UTC m=+152.085895762" watchObservedRunningTime="2025-10-06 14:55:54.931770534 +0000 UTC m=+152.087063046"
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.960954 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hksqs" podStartSLOduration=129.960940432 podStartE2EDuration="2m9.960940432s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:54.944567246 +0000 UTC m=+152.099859748" watchObservedRunningTime="2025-10-06 14:55:54.960940432 +0000 UTC m=+152.116232944"
Oct 06 14:55:54 crc kubenswrapper[4763]: I1006 14:55:54.995027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:54 crc kubenswrapper[4763]: E1006 14:55:54.995311 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.495297332 +0000 UTC m=+152.650589844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.098125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.098493 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.598468994 +0000 UTC m=+152.753761506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.199918 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.200093 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.700066639 +0000 UTC m=+152.855359161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.200162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.200499 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.700492302 +0000 UTC m=+152.855784814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.270116 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-82t6f" podStartSLOduration=130.270098227 podStartE2EDuration="2m10.270098227s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:55.033543815 +0000 UTC m=+152.188836327" watchObservedRunningTime="2025-10-06 14:55:55.270098227 +0000 UTC m=+152.425390739"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.271781 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l5zxt"]
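Annotation: every MountDevice/TearDown failure above has the same root cause, "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" -- the kubelet has restarted and the hostpath CSI plugin has not yet re-registered, so all volume operations for pvc-657094db are requeued. A node's registered CSI drivers are surfaced in its CSINode object; the following is a minimal client-go sketch for inspecting them (the kubeconfig path and node name "crc" are assumptions for this cluster, not taken from the log):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Illustrative kubeconfig path; adjust for your environment.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // CSINode mirrors the kubelet's list of registered CSI drivers.
        // Until kubevirt.io.hostpath-provisioner appears here, the mounts above keep failing.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Println(d.Name, d.NodeID)
        }
    }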
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.272678 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.277575 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.286340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5zxt"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.300943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.301183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wfzl\" (UniqueName: \"kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.301240 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.301289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.301388 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.801373857 +0000 UTC m=+152.956666369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.402880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.403140 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.403298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wfzl\" (UniqueName: \"kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.403435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.403474 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:55.903458807 +0000 UTC m=+153.058751319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.404040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.404211 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.430252 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wfzl\" (UniqueName: \"kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl\") pod \"community-operators-l5zxt\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.457463 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.458463 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.462341 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.468175 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.505004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.505196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zh9w\" (UniqueName: \"kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.505236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.505279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.505396 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:56.005381092 +0000 UTC m=+153.160673604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.522494 4763 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.590392 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5zxt"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.606034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zh9w\" (UniqueName: \"kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.606106 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.606168 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.606190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.606492 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 14:55:56.106481353 +0000 UTC m=+153.261773865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9wh2c" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.607913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.608739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.623343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zh9w\" (UniqueName: \"kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w\") pod \"certified-operators-7cf9s\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.672680 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-thfpq"]
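Annotation: each failed operation above is parked by nestedpendingoperations.go with a "No retries permitted until <t>" deadline; the reconciler keeps requeuing the volume every ~100ms, but the operation only actually reruns once the deadline passes (500ms after the failure here, growing on repeated failures and capped in the real kubelet). A rough, illustrative Go sketch of that gating, under those stated assumptions and not the kubelet's actual implementation:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // backoff gates an operation the way the log lines above describe:
    // after a failure, no retries are permitted until the deadline.
    type backoff struct {
        delay time.Duration
        until time.Time
    }

    func (b *backoff) try(op func() error) error {
        if time.Now().Before(b.until) {
            return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
                b.until.Format(time.RFC3339Nano), b.delay)
        }
        if err := op(); err != nil {
            if b.delay == 0 {
                b.delay = 500 * time.Millisecond // initial backoff seen in the log
            } else {
                b.delay *= 2 // grows on consecutive failures (capped in practice)
            }
            b.until = time.Now().Add(b.delay)
            return err
        }
        b.delay, b.until = 0, time.Time{} // success resets the backoff
        return nil
    }

    func main() {
        b := &backoff{}
        mount := func() error {
            return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
        }
        for i := 0; i < 3; i++ {
            fmt.Println(b.try(mount))
            time.Sleep(100 * time.Millisecond) // reconciler loop period
        }
    }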
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.673747 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.685160 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thfpq"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.711071 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.711296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.711339 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mps8g\" (UniqueName: \"kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.711365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: E1006 14:55:55.711708 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 14:55:56.211690304 +0000 UTC m=+153.366982816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.733480 4763 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T14:55:55.522517091Z","Handler":null,"Name":""}
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.740736 4763 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.740768 4763 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.775736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7cf9s"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.814478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mps8g\" (UniqueName: \"kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.814525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.814594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.814692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.815117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.815227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.816960 4763 generic.go:334] "Generic (PLEG): container finished" podID="6b243bd7-367c-41e7-9101-981ed6d10a13" containerID="f8e75e84c9eadf077b260ee87240d74c4238eb9a17823964c806038b34446ae8" exitCode=0
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.817012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" event={"ID":"6b243bd7-367c-41e7-9101-981ed6d10a13","Type":"ContainerDied","Data":"f8e75e84c9eadf077b260ee87240d74c4238eb9a17823964c806038b34446ae8"}
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.821163 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.821202 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.823957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" event={"ID":"b01c4848-8d37-4d3d-928c-c627cb4f3890","Type":"ContainerStarted","Data":"c74cd40778f804a121125f3085b7590f532dc1bb4a5c00ddd50d9ca8c77bc9c5"}
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.823993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" event={"ID":"b01c4848-8d37-4d3d-928c-c627cb4f3890","Type":"ContainerStarted","Data":"bd6e46c0786fadf6ada14bd46980532d3ac36989df944868b8c9c24b9f35148a"}
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.824004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" event={"ID":"b01c4848-8d37-4d3d-928c-c627cb4f3890","Type":"ContainerStarted","Data":"cec9d6b13894acf0af33d844e0a3b2f5d2ae6e2e748f7f75b68333440f7253f2"}
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.836328 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.838329 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mps8g\" (UniqueName: \"kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g\") pod \"community-operators-thfpq\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.846659 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5zxt"]
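Annotation: the sequence above is the kubelet's CSI plugin registration handshake unblocking the stuck mount: plugin_watcher.go spots kubevirt.io.hostpath-provisioner-reg.sock under /var/lib/kubelet/plugins_registry, csi_plugin.go validates and registers the driver, and the pending MountDevice immediately succeeds (with staging skipped because the driver does not advertise STAGE_UNSTAGE_VOLUME). A minimal sketch of the registration gRPC service that a node-driver-registrar sidecar serves on that socket, assuming the k8s.io/kubelet pluginregistration/v1 API (generated-code details vary by kubelet version):

    package main

    import (
        "context"
        "net"

        "google.golang.org/grpc"
        registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
    )

    // registrar answers the kubelet's GetInfo call with the CSI driver's
    // name and its real CSI socket endpoint, matching the log above.
    type registrar struct{}

    func (registrar) GetInfo(ctx context.Context, req *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
        return &registerapi.PluginInfo{
            Type:              registerapi.CSIPlugin,
            Name:              "kubevirt.io.hostpath-provisioner",
            Endpoint:          "/var/lib/kubelet/plugins/csi-hostpath/csi.sock",
            SupportedVersions: []string{"1.0.0"},
        }, nil
    }

    func (registrar) NotifyRegistrationStatus(ctx context.Context, status *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
        // The kubelet calls back here after validating the driver (csi_plugin.go:100/113 above).
        return &registerapi.RegistrationStatusResponse{}, nil
    }

    func main() {
        l, err := net.Listen("unix", "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock")
        if err != nil {
            panic(err)
        }
        s := grpc.NewServer()
        registerapi.RegisterRegistrationServer(s, registrar{})
        _ = s.Serve(l)
    }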
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.849559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9wh2c\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:55 crc kubenswrapper[4763]: W1006 14:55:55.851061 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode364490a_9e0b_45fb_bbf3_f6c603f7e9ca.slice/crio-d56ef74d6b509967d9caa0310542280aa1c7ef3e142e0d6567e701db9c167a66 WatchSource:0}: Error finding container d56ef74d6b509967d9caa0310542280aa1c7ef3e142e0d6567e701db9c167a66: Status 404 returned error can't find the container with id d56ef74d6b509967d9caa0310542280aa1c7ef3e142e0d6567e701db9c167a66
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.864265 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qx94g" podStartSLOduration=9.864246563 podStartE2EDuration="9.864246563s" podCreationTimestamp="2025-10-06 14:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:55.861027659 +0000 UTC m=+153.016320171" watchObservedRunningTime="2025-10-06 14:55:55.864246563 +0000 UTC m=+153.019539075"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.866559 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.867422 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.879516 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"]
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.915268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.915661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.915952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjml\" (UniqueName: \"kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.915996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.925977 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 14:55:55 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Oct 06 14:55:55 crc kubenswrapper[4763]: [+]process-running ok
Oct 06 14:55:55 crc kubenswrapper[4763]: healthz check failed
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.926023 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 14:55:55 crc kubenswrapper[4763]: I1006 14:55:55.926405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.002445 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"]
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.017294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjml\" (UniqueName: \"kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.017345 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.017379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.017747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.017853 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.033891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thfpq"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.038279 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjml\" (UniqueName: \"kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml\") pod \"certified-operators-2hnlh\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.062901 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.207562 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hnlh"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.324869 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"]
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.393450 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thfpq"]
Oct 06 14:55:56 crc kubenswrapper[4763]: W1006 14:55:56.416044 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77c108d_3366_4d93_a5a7_c59db5afaca0.slice/crio-17e534aceb03545c0998cf333dde7e9b55c1391218fee93287e1bb69e393a909 WatchSource:0}: Error finding container 17e534aceb03545c0998cf333dde7e9b55c1391218fee93287e1bb69e393a909: Status 404 returned error can't find the container with id 17e534aceb03545c0998cf333dde7e9b55c1391218fee93287e1bb69e393a909
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.430986 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"]
Oct 06 14:55:56 crc kubenswrapper[4763]: W1006 14:55:56.440730 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1150f7d8_8918_497a_9166_869e2fd2eb04.slice/crio-cbab0e5f3e1e5806a79770b6913bfa2b48d2c9e3c28130f27890923783110c08 WatchSource:0}: Error finding container cbab0e5f3e1e5806a79770b6913bfa2b48d2c9e3c28130f27890923783110c08: Status 404 returned error can't find the container with id cbab0e5f3e1e5806a79770b6913bfa2b48d2c9e3c28130f27890923783110c08
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.829469 4763 generic.go:334] "Generic (PLEG): container finished" podID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerID="fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712" exitCode=0
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.830038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerDied","Data":"fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.830067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerStarted","Data":"17e534aceb03545c0998cf333dde7e9b55c1391218fee93287e1bb69e393a909"}
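Annotation: the router's startup probe keeps returning 500, and the logged body enumerates named sub-checks ([-]backend-http, [-]has-synced, [+]process-running) in the aggregated-healthz style; the kubelet treats the non-2xx/3xx status as failure and re-probes until the startup probe passes. A minimal, self-contained Go sketch of such an aggregated /healthz handler (the check names and port mirror the log but the handler itself is illustrative, not the router's code):

    package main

    import (
        "fmt"
        "net/http"
    )

    // check is one named sub-check, mirroring the [+]/[-] lines in the probe body.
    type check struct {
        name string
        fn   func() error
    }

    func healthz(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            failed := false
            body := ""
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                // The kubelet then logs: HTTP probe failed with statuscode: 500.
                w.WriteHeader(http.StatusInternalServerError)
                body += "healthz check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        http.HandleFunc("/healthz", healthz([]check{
            {"backend-http", func() error { return fmt.Errorf("not ready") }},
            {"has-synced", func() error { return fmt.Errorf("not ready") }},
            {"process-running", func() error { return nil }},
        }))
        _ = http.ListenAndServe(":1936", nil) // port is an assumption for the sketch
    }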
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.831017 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.831863 4763 generic.go:334] "Generic (PLEG): container finished" podID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerID="234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2" exitCode=0
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.831915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerDied","Data":"234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.831939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerStarted","Data":"cbab0e5f3e1e5806a79770b6913bfa2b48d2c9e3c28130f27890923783110c08"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.834073 4763 generic.go:334] "Generic (PLEG): container finished" podID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerID="952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f" exitCode=0
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.834116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerDied","Data":"952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.834869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerStarted","Data":"d56ef74d6b509967d9caa0310542280aa1c7ef3e142e0d6567e701db9c167a66"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.850134 4763 generic.go:334] "Generic (PLEG): container finished" podID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerID="a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6" exitCode=0
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.850189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerDied","Data":"a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.850211 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerStarted","Data":"2137dd38c0bdf410dceee4bc5eb93b2174dd2d98fa999a4e1be0ec02a13cf137"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.854286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" event={"ID":"4f380f6e-5ecf-460d-b7ec-9e7c36c21326","Type":"ContainerStarted","Data":"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.854304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" event={"ID":"4f380f6e-5ecf-460d-b7ec-9e7c36c21326","Type":"ContainerStarted","Data":"0a1a3ed42c8058fe1a4a9dc68e66ddae950334884f7de07979d3253f19e30976"}
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.926950 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 14:55:56 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Oct 06 14:55:56 crc kubenswrapper[4763]: [+]process-running ok
Oct 06 14:55:56 crc kubenswrapper[4763]: healthz check failed
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.927025 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 14:55:56 crc kubenswrapper[4763]: I1006 14:55:56.960112 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" podStartSLOduration=131.960090344 podStartE2EDuration="2m11.960090344s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:56.958467056 +0000 UTC m=+154.113759578" watchObservedRunningTime="2025-10-06 14:55:56.960090344 +0000 UTC m=+154.115382856"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.055989 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.128882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slcz4\" (UniqueName: \"kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4\") pod \"6b243bd7-367c-41e7-9101-981ed6d10a13\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") "
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.128968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume\") pod \"6b243bd7-367c-41e7-9101-981ed6d10a13\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") "
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.129012 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume\") pod \"6b243bd7-367c-41e7-9101-981ed6d10a13\" (UID: \"6b243bd7-367c-41e7-9101-981ed6d10a13\") "
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.130790 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b243bd7-367c-41e7-9101-981ed6d10a13" (UID: "6b243bd7-367c-41e7-9101-981ed6d10a13"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.141974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b243bd7-367c-41e7-9101-981ed6d10a13" (UID: "6b243bd7-367c-41e7-9101-981ed6d10a13"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.156258 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4" (OuterVolumeSpecName: "kube-api-access-slcz4") pod "6b243bd7-367c-41e7-9101-981ed6d10a13" (UID: "6b243bd7-367c-41e7-9101-981ed6d10a13"). InnerVolumeSpecName "kube-api-access-slcz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.229982 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slcz4\" (UniqueName: \"kubernetes.io/projected/6b243bd7-367c-41e7-9101-981ed6d10a13-kube-api-access-slcz4\") on node \"crc\" DevicePath \"\""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.230018 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b243bd7-367c-41e7-9101-981ed6d10a13-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.230030 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b243bd7-367c-41e7-9101-981ed6d10a13-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.394437 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 06 14:55:57 crc kubenswrapper[4763]: E1006 14:55:57.394638 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b243bd7-367c-41e7-9101-981ed6d10a13" containerName="collect-profiles"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.394650 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b243bd7-367c-41e7-9101-981ed6d10a13" containerName="collect-profiles"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.394749 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b243bd7-367c-41e7-9101-981ed6d10a13" containerName="collect-profiles"
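Annotation: the pod_startup_latency_tracker entries are simple time arithmetic: with no image pulls (both pull timestamps are the zero time "0001-01-01"), podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp. A tiny Go check against the image-registry entry above:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Values taken from the image-registry-697d97f7c8-9wh2c log entry above.
        created, _ := time.Parse(time.RFC3339Nano, "2025-10-06T14:53:45Z")
        observed, _ := time.Parse(time.RFC3339Nano, "2025-10-06T14:55:56.960090344Z")
        fmt.Println(observed.Sub(created)) // prints 2m11.960090344s, matching podStartE2EDuration
    }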
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.395117 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.398094 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.398387 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.407039 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.431684 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.431737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.466699 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"]
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.467636 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.472743 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.484080 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"]
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533351 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533762 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8g8h\" (UniqueName: \"kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.533933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.549884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.586098 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.635216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.635529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8g8h\" (UniqueName: \"kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.635733 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.635938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.636307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.656994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8g8h\" (UniqueName: \"kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h\") pod \"redhat-marketplace-zd9hq\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.724042 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.782192 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.787570 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xlb8f"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.792847 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zd9hq"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.827892 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-7n989 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.827946 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-7n989 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.827952 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7n989" podUID="9ab5712d-bdff-43e3-b9ab-26d452c17259" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.828008 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7n989" podUID="9ab5712d-bdff-43e3-b9ab-26d452c17259" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.885516 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.885539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5" event={"ID":"6b243bd7-367c-41e7-9101-981ed6d10a13","Type":"ContainerDied","Data":"bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b"}
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.885562 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc57257bae14c9312fbf717cfea1ada5eed0e26f48ae45292419f96e85afa23b"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.885957 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.897113 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"]
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.905169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88kng"
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.907737 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"]
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.928166 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 14:55:57 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Oct 06 14:55:57 crc kubenswrapper[4763]: [+]process-running ok
Oct 06 14:55:57 crc kubenswrapper[4763]: healthz check failed
Oct 06 14:55:57 crc kubenswrapper[4763]: I1006 14:55:57.928232 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.047452 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng"
Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.047548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng"
Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.047670 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bsn\" (UniqueName: \"kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng"
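Annotation: the download-server probes above fail at the transport layer (connection refused) rather than with a bad status code. The kubelet's HTTP prober counts a response code in [200, 400) as success and anything else, including a dial error like this, as failure. A minimal, self-contained Go sketch of that success rule (an illustration of the semantics, not the kubelet's prober code):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeHTTP mirrors the kubelet HTTP prober's rule: 200 <= code < 400
    // is success; transport errors (e.g. "connect: connection refused",
    // as for downloads-7954f5f757-7n989 above) are failures.
    func probeHTTP(url string, timeout time.Duration) error {
        c := &http.Client{Timeout: timeout}
        resp, err := c.Get(url)
        if err != nil {
            return err // dial/transport failure
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return nil
        }
        return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
        if err := probeHTTP("http://10.217.0.10:8080/", time.Second); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }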
UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 14:55:58 crc kubenswrapper[4763]: W1006 14:55:58.086126 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode26213b6_9e33_4bd3_9a9c_cd07b049cf17.slice/crio-d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74 WatchSource:0}: Error finding container d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74: Status 404 returned error can't find the container with id d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74 Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.126581 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"] Oct 06 14:55:58 crc kubenswrapper[4763]: W1006 14:55:58.141265 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c90a53_6c57_4fd5_b9d7_77243c584cc4.slice/crio-3ff71cd77a97b853ea94808baaceb713ce930459390bcca16b8eec1b8cf60db6 WatchSource:0}: Error finding container 3ff71cd77a97b853ea94808baaceb713ce930459390bcca16b8eec1b8cf60db6: Status 404 returned error can't find the container with id 3ff71cd77a97b853ea94808baaceb713ce930459390bcca16b8eec1b8cf60db6 Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.149523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bsn\" (UniqueName: \"kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.149653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.149731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.152645 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.153598 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content\") pod \"redhat-marketplace-88kng\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.169722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2bsn\" (UniqueName: \"kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn\") pod \"redhat-marketplace-88kng\" (UID: 
\"9740294c-47bf-442d-b72c-e58ee8877f3b\") " pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.263977 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.472211 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"] Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.473657 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.482213 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"] Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.482313 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 14:55:58 crc kubenswrapper[4763]: W1006 14:55:58.531695 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9740294c_47bf_442d_b72c_e58ee8877f3b.slice/crio-126fd65f897b2ce35051b58f55536eb98afe6da412f4dcf64b28770e3f5da010 WatchSource:0}: Error finding container 126fd65f897b2ce35051b58f55536eb98afe6da412f4dcf64b28770e3f5da010: Status 404 returned error can't find the container with id 126fd65f897b2ce35051b58f55536eb98afe6da412f4dcf64b28770e3f5da010 Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.533098 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"] Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.656303 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.656370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smqhw\" (UniqueName: \"kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.656424 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.757666 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.757732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smqhw\" (UniqueName: 
\"kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.757784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.758169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.758257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.774997 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smqhw\" (UniqueName: \"kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw\") pod \"redhat-operators-nczcd\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.835530 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.859490 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.860399 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.870388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.896543 4763 generic.go:334] "Generic (PLEG): container finished" podID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerID="cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436" exitCode=0 Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.897200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerDied","Data":"cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.897222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerStarted","Data":"3ff71cd77a97b853ea94808baaceb713ce930459390bcca16b8eec1b8cf60db6"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.900631 4763 generic.go:334] "Generic (PLEG): container finished" podID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerID="a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c" exitCode=0 Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.900675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerDied","Data":"a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.900696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerStarted","Data":"126fd65f897b2ce35051b58f55536eb98afe6da412f4dcf64b28770e3f5da010"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.914913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26213b6-9e33-4bd3-9a9c-cd07b049cf17","Type":"ContainerStarted","Data":"281aecc7efcdb8b5365b128e3a6196db613c35e6f4eec6c2ab9f7461f6478df8"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.914966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26213b6-9e33-4bd3-9a9c-cd07b049cf17","Type":"ContainerStarted","Data":"d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74"} Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.927458 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 14:55:58 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Oct 06 14:55:58 crc kubenswrapper[4763]: [+]process-running ok Oct 06 14:55:58 crc kubenswrapper[4763]: healthz check failed Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.927854 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 14:55:58 
crc kubenswrapper[4763]: I1006 14:55:58.961115 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.961180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:58 crc kubenswrapper[4763]: I1006 14:55:58.961200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwgk\" (UniqueName: \"kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.063201 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.063476 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwgk\" (UniqueName: \"kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.063569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.065417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.067168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.088097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwgk\" (UniqueName: \"kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk\") pod \"redhat-operators-vlxmh\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 
crc kubenswrapper[4763]: I1006 14:55:59.184442 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.184409767 podStartE2EDuration="2.184409767s" podCreationTimestamp="2025-10-06 14:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:55:58.956368002 +0000 UTC m=+156.111660504" watchObservedRunningTime="2025-10-06 14:55:59.184409767 +0000 UTC m=+156.339702279" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.187202 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"] Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.193082 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.425714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.426003 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.426444 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.451710 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.451791 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.453007 4763 patch_prober.go:28] interesting pod/console-f9d7485db-t2f4r container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.453052 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-t2f4r" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.460813 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.923324 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.939816 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerStarted","Data":"4a4b858487d9dd0aae15b68adde6668975df176fbf91a24ff0c0cac4e39b9a00"} Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.939993 4763 patch_prober.go:28] interesting pod/router-default-5444994796-rtxf5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Oct 06 14:55:59 crc kubenswrapper[4763]: [+]has-synced ok Oct 06 14:55:59 crc kubenswrapper[4763]: [+]process-running ok Oct 06 14:55:59 crc kubenswrapper[4763]: healthz check failed Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.940023 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtxf5" podUID="4efa2f6b-8c71-41bf-90a8-b18f5cf8eb13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.943981 4763 generic.go:334] "Generic (PLEG): container finished" podID="e26213b6-9e33-4bd3-9a9c-cd07b049cf17" containerID="281aecc7efcdb8b5365b128e3a6196db613c35e6f4eec6c2ab9f7461f6478df8" exitCode=0 Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.944035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26213b6-9e33-4bd3-9a9c-cd07b049cf17","Type":"ContainerDied","Data":"281aecc7efcdb8b5365b128e3a6196db613c35e6f4eec6c2ab9f7461f6478df8"} Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.948916 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerStarted","Data":"0c1f608501dabda7f8f9a7bed575e37c2d1ad85bb96f96b1563a8e6e5af6e735"} Oct 06 14:55:59 crc kubenswrapper[4763]: I1006 14:55:59.954256 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jmmk4" Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.925446 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.959218 4763 generic.go:334] "Generic (PLEG): container finished" podID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerID="92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490" exitCode=0 Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.959311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerDied","Data":"92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490"} Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.962750 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6759086-6fd8-4989-8418-2d443118ea98" containerID="c27700bc4c9a51ef9aaa0614e76ffb0968a3093d1fdaca3e79da02286d53752e" exitCode=0 Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.962835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerDied","Data":"c27700bc4c9a51ef9aaa0614e76ffb0968a3093d1fdaca3e79da02286d53752e"} Oct 06 14:56:00 crc kubenswrapper[4763]: I1006 14:56:00.965203 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-rtxf5" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.309448 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.310438 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.314699 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.316564 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.316780 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.452839 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.452997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.460321 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.554206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir\") pod \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.554296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access\") pod \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\" (UID: \"e26213b6-9e33-4bd3-9a9c-cd07b049cf17\") " Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.554405 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.554467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.554536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 
14:56:01.554590 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e26213b6-9e33-4bd3-9a9c-cd07b049cf17" (UID: "e26213b6-9e33-4bd3-9a9c-cd07b049cf17"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.559746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e26213b6-9e33-4bd3-9a9c-cd07b049cf17" (UID: "e26213b6-9e33-4bd3-9a9c-cd07b049cf17"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.572537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.645859 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.664655 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.664951 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26213b6-9e33-4bd3-9a9c-cd07b049cf17-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.986699 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26213b6-9e33-4bd3-9a9c-cd07b049cf17","Type":"ContainerDied","Data":"d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74"} Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.986761 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d520097dfbeb89825df60a956649594129a500c7d2e0a3ea17fba8f6ba298a74" Oct 06 14:56:01 crc kubenswrapper[4763]: I1006 14:56:01.986741 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 14:56:02 crc kubenswrapper[4763]: I1006 14:56:02.132767 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 14:56:03 crc kubenswrapper[4763]: I1006 14:56:03.023900 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b","Type":"ContainerStarted","Data":"35faf4b5a3eb8b0434783e03dc1fe4a7e829c0e67fe07b36c747868ad984cebb"} Oct 06 14:56:03 crc kubenswrapper[4763]: I1006 14:56:03.024246 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b","Type":"ContainerStarted","Data":"9aaa6a049f99aceb9c35dd6648348d0de4612e2336288a0e4e0b69c465ac3428"} Oct 06 14:56:03 crc kubenswrapper[4763]: I1006 14:56:03.041766 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.041740708 podStartE2EDuration="2.041740708s" podCreationTimestamp="2025-10-06 14:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:56:03.037815454 +0000 UTC m=+160.193107966" watchObservedRunningTime="2025-10-06 14:56:03.041740708 +0000 UTC m=+160.197033220" Oct 06 14:56:03 crc kubenswrapper[4763]: I1006 14:56:03.877399 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:56:03 crc kubenswrapper[4763]: I1006 14:56:03.877888 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:56:04 crc kubenswrapper[4763]: I1006 14:56:04.048879 4763 generic.go:334] "Generic (PLEG): container finished" podID="0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" containerID="35faf4b5a3eb8b0434783e03dc1fe4a7e829c0e67fe07b36c747868ad984cebb" exitCode=0 Oct 06 14:56:04 crc kubenswrapper[4763]: I1006 14:56:04.048995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b","Type":"ContainerDied","Data":"35faf4b5a3eb8b0434783e03dc1fe4a7e829c0e67fe07b36c747868ad984cebb"} Oct 06 14:56:05 crc kubenswrapper[4763]: I1006 14:56:05.000110 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9h5w7" Oct 06 14:56:07 crc kubenswrapper[4763]: I1006 14:56:07.832747 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7n989" Oct 06 14:56:08 crc kubenswrapper[4763]: I1006 14:56:08.190222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " 
pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:56:08 crc kubenswrapper[4763]: I1006 14:56:08.212549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6aeb0e7-db42-449d-8052-fc68154e93d2-metrics-certs\") pod \"network-metrics-daemon-hgd8l\" (UID: \"d6aeb0e7-db42-449d-8052-fc68154e93d2\") " pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:56:08 crc kubenswrapper[4763]: I1006 14:56:08.422560 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgd8l" Oct 06 14:56:09 crc kubenswrapper[4763]: I1006 14:56:09.661767 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:56:09 crc kubenswrapper[4763]: I1006 14:56:09.664710 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.485908 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.527109 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access\") pod \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.527184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir\") pod \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\" (UID: \"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b\") " Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.527301 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" (UID: "0fd53d5e-ae7a-49ac-a28f-ae041c628a4b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.527448 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.532296 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" (UID: "0fd53d5e-ae7a-49ac-a28f-ae041c628a4b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:11 crc kubenswrapper[4763]: I1006 14:56:11.627924 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fd53d5e-ae7a-49ac-a28f-ae041c628a4b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:12 crc kubenswrapper[4763]: I1006 14:56:12.116327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0fd53d5e-ae7a-49ac-a28f-ae041c628a4b","Type":"ContainerDied","Data":"9aaa6a049f99aceb9c35dd6648348d0de4612e2336288a0e4e0b69c465ac3428"} Oct 06 14:56:12 crc kubenswrapper[4763]: I1006 14:56:12.116367 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aaa6a049f99aceb9c35dd6648348d0de4612e2336288a0e4e0b69c465ac3428" Oct 06 14:56:12 crc kubenswrapper[4763]: I1006 14:56:12.116460 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 14:56:16 crc kubenswrapper[4763]: I1006 14:56:16.068690 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.180371 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.181051 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smqhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nczcd_openshift-marketplace(26eaff77-0592-4caa-8fa9-f2ad86445c41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.182746 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nczcd" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.556090 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.556349 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6zh9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7cf9s_openshift-marketplace(d61abe4d-a32e-4520-a8a5-bb9f111a7c28): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.558526 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7cf9s" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.646444 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nczcd" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.729564 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.729731 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2bsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-88kng_openshift-marketplace(9740294c-47bf-442d-b72c-e58ee8877f3b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.731065 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-88kng" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.756799 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.757051 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wfzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l5zxt_openshift-marketplace(e364490a-9e0b-45fb-bbf3-f6c603f7e9ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.758327 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l5zxt" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.761449 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.761557 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8g8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zd9hq_openshift-marketplace(c6c90a53-6c57-4fd5-b9d7-77243c584cc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.763821 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zd9hq" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.823258 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.823678 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mps8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-thfpq_openshift-marketplace(d77c108d-3366-4d93-a5a7-c59db5afaca0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 14:56:25 crc kubenswrapper[4763]: E1006 14:56:25.825065 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-thfpq" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" Oct 06 14:56:25 crc kubenswrapper[4763]: I1006 14:56:25.909864 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgd8l"] Oct 06 14:56:25 crc kubenswrapper[4763]: W1006 14:56:25.918402 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6aeb0e7_db42_449d_8052_fc68154e93d2.slice/crio-fa4bfbbd589ac65d337a038f26fd6b3c41f5b878d20b40130a55046977b27833 WatchSource:0}: Error finding container fa4bfbbd589ac65d337a038f26fd6b3c41f5b878d20b40130a55046977b27833: Status 404 returned error can't find the container with id fa4bfbbd589ac65d337a038f26fd6b3c41f5b878d20b40130a55046977b27833 Oct 06 14:56:26 crc kubenswrapper[4763]: I1006 14:56:26.205750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" event={"ID":"d6aeb0e7-db42-449d-8052-fc68154e93d2","Type":"ContainerStarted","Data":"b44b9736f03234166feb80426f2afce14d2ba9316caf1a0e11cc9b16ee751216"} Oct 06 14:56:26 crc kubenswrapper[4763]: I1006 14:56:26.206116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" event={"ID":"d6aeb0e7-db42-449d-8052-fc68154e93d2","Type":"ContainerStarted","Data":"fa4bfbbd589ac65d337a038f26fd6b3c41f5b878d20b40130a55046977b27833"} Oct 06 14:56:26 crc kubenswrapper[4763]: I1006 14:56:26.207575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" 
event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerStarted","Data":"0cccfaece243deabcf291227085b8b14f726b6cbdb5be4cd1400705dc06d5798"} Oct 06 14:56:26 crc kubenswrapper[4763]: I1006 14:56:26.212856 4763 generic.go:334] "Generic (PLEG): container finished" podID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerID="72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf" exitCode=0 Oct 06 14:56:26 crc kubenswrapper[4763]: I1006 14:56:26.213518 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerDied","Data":"72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf"} Oct 06 14:56:26 crc kubenswrapper[4763]: E1006 14:56:26.218071 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l5zxt" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" Oct 06 14:56:26 crc kubenswrapper[4763]: E1006 14:56:26.218324 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-88kng" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" Oct 06 14:56:26 crc kubenswrapper[4763]: E1006 14:56:26.218748 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7cf9s" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" Oct 06 14:56:26 crc kubenswrapper[4763]: E1006 14:56:26.218797 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zd9hq" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" Oct 06 14:56:26 crc kubenswrapper[4763]: E1006 14:56:26.218838 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-thfpq" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.221559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerStarted","Data":"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237"} Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.225108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgd8l" event={"ID":"d6aeb0e7-db42-449d-8052-fc68154e93d2","Type":"ContainerStarted","Data":"e4b753cda7479d5ec229fd49c154486d586d1c628fb335c6591102cca99f13dc"} Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.228084 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6759086-6fd8-4989-8418-2d443118ea98" 
containerID="0cccfaece243deabcf291227085b8b14f726b6cbdb5be4cd1400705dc06d5798" exitCode=0 Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.228135 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerDied","Data":"0cccfaece243deabcf291227085b8b14f726b6cbdb5be4cd1400705dc06d5798"} Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.244377 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hnlh" podStartSLOduration=2.462153701 podStartE2EDuration="32.244355558s" podCreationTimestamp="2025-10-06 14:55:55 +0000 UTC" firstStartedPulling="2025-10-06 14:55:56.833773319 +0000 UTC m=+153.989065841" lastFinishedPulling="2025-10-06 14:56:26.615975106 +0000 UTC m=+183.771267698" observedRunningTime="2025-10-06 14:56:27.242082902 +0000 UTC m=+184.397375414" watchObservedRunningTime="2025-10-06 14:56:27.244355558 +0000 UTC m=+184.399648110" Oct 06 14:56:27 crc kubenswrapper[4763]: I1006 14:56:27.284807 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hgd8l" podStartSLOduration=162.284775864 podStartE2EDuration="2m42.284775864s" podCreationTimestamp="2025-10-06 14:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:56:27.271202019 +0000 UTC m=+184.426494541" watchObservedRunningTime="2025-10-06 14:56:27.284775864 +0000 UTC m=+184.440068406" Oct 06 14:56:28 crc kubenswrapper[4763]: I1006 14:56:28.236068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerStarted","Data":"c490a56ba58b81513416ef524501d9d3a94b600b2d91fd3291b895858c8f2774"} Oct 06 14:56:28 crc kubenswrapper[4763]: I1006 14:56:28.258737 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vlxmh" podStartSLOduration=3.532396886 podStartE2EDuration="30.258719119s" podCreationTimestamp="2025-10-06 14:55:58 +0000 UTC" firstStartedPulling="2025-10-06 14:56:00.965833843 +0000 UTC m=+158.121126355" lastFinishedPulling="2025-10-06 14:56:27.692156076 +0000 UTC m=+184.847448588" observedRunningTime="2025-10-06 14:56:28.256041981 +0000 UTC m=+185.411334533" watchObservedRunningTime="2025-10-06 14:56:28.258719119 +0000 UTC m=+185.414011631" Oct 06 14:56:29 crc kubenswrapper[4763]: I1006 14:56:29.193893 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:29 crc kubenswrapper[4763]: I1006 14:56:29.194108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:29 crc kubenswrapper[4763]: I1006 14:56:29.613378 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d9hfk" Oct 06 14:56:30 crc kubenswrapper[4763]: I1006 14:56:30.339003 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vlxmh" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="registry-server" probeResult="failure" output=< Oct 06 14:56:30 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 14:56:30 
crc kubenswrapper[4763]: > Oct 06 14:56:32 crc kubenswrapper[4763]: I1006 14:56:32.615974 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 14:56:33 crc kubenswrapper[4763]: I1006 14:56:33.876540 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:56:33 crc kubenswrapper[4763]: I1006 14:56:33.876587 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:56:36 crc kubenswrapper[4763]: I1006 14:56:36.208538 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:36 crc kubenswrapper[4763]: I1006 14:56:36.208958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:36 crc kubenswrapper[4763]: I1006 14:56:36.273543 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:36 crc kubenswrapper[4763]: I1006 14:56:36.328243 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:36 crc kubenswrapper[4763]: I1006 14:56:36.501603 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"] Oct 06 14:56:38 crc kubenswrapper[4763]: I1006 14:56:38.294824 4763 generic.go:334] "Generic (PLEG): container finished" podID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerID="033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8" exitCode=0 Oct 06 14:56:38 crc kubenswrapper[4763]: I1006 14:56:38.295536 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hnlh" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="registry-server" containerID="cri-o://937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237" gracePeriod=2 Oct 06 14:56:38 crc kubenswrapper[4763]: I1006 14:56:38.295037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerDied","Data":"033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.126231 4763 util.go:48] "No ready sandbox for pod can be found. 
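
[Note: the startup-probe failure above (timeout: failed to connect service ":50051" within 1s) is the registry-server's gRPC health check: the catalog pods expose the standard gRPC health service on port 50051, and the probe fails until the server has loaded its catalog content. The startup probe gates the readiness probe, which is why each pod shows the startup "unhealthy" → "started" transition before readiness flips to "ready". A minimal client-side equivalent of that check, assuming only the standard grpc-go health API; the address and 1s timeout mirror the log:

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        // Dial the registry-server's gRPC port; the probe in the log uses :50051.
        conn, err := grpc.DialContext(ctx, "localhost:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
        if err != nil {
            fmt.Println("probe failure:", err) // e.g. context deadline exceeded
            return
        }
        defer conn.Close()

        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil || resp.Status != healthpb.HealthCheckResponse_SERVING {
            fmt.Println("probe failure:", err)
            return
        }
        fmt.Println("probe success: SERVING")
    }

]
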
Need to start a new one" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.209169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities\") pod \"1150f7d8-8918-497a-9166-869e2fd2eb04\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.209233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content\") pod \"1150f7d8-8918-497a-9166-869e2fd2eb04\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.209253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qjml\" (UniqueName: \"kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml\") pod \"1150f7d8-8918-497a-9166-869e2fd2eb04\" (UID: \"1150f7d8-8918-497a-9166-869e2fd2eb04\") " Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.210836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities" (OuterVolumeSpecName: "utilities") pod "1150f7d8-8918-497a-9166-869e2fd2eb04" (UID: "1150f7d8-8918-497a-9166-869e2fd2eb04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.214145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml" (OuterVolumeSpecName: "kube-api-access-8qjml") pod "1150f7d8-8918-497a-9166-869e2fd2eb04" (UID: "1150f7d8-8918-497a-9166-869e2fd2eb04"). InnerVolumeSpecName "kube-api-access-8qjml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.241754 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.253704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1150f7d8-8918-497a-9166-869e2fd2eb04" (UID: "1150f7d8-8918-497a-9166-869e2fd2eb04"). InnerVolumeSpecName "catalog-content". 
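
[Note: the UnmountVolume → TearDown → "Volume detached" sequence above is the volume manager reclaiming certified-operators-2hnlh's volumes after its containers stopped: the reconciler notices the pod is being deleted, tears down each volume, and only then marks it detached so the pod directory can be cleaned up. For an emptyDir volume the teardown is conceptually just removal of the per-pod backing directory under /var/lib/kubelet/pods; a hedged sketch of that idea (the path layout matches the log, the helper name is illustrative, and the real kubelet also handles SELinux labels and quota accounting):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // teardownEmptyDir removes the backing directory of an emptyDir volume.
    func teardownEmptyDir(podUID, volume string) error {
        p := filepath.Join("/var/lib/kubelet/pods", podUID,
            "volumes", "kubernetes.io~empty-dir", volume)
        return os.RemoveAll(p)
    }

    func main() {
        // UID and volume name taken from the entries above.
        err := teardownEmptyDir("1150f7d8-8918-497a-9166-869e2fd2eb04", "catalog-content")
        fmt.Println(err)
    }

]
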
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.287329 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.310481 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.310516 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1150f7d8-8918-497a-9166-869e2fd2eb04-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.310534 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qjml\" (UniqueName: \"kubernetes.io/projected/1150f7d8-8918-497a-9166-869e2fd2eb04-kube-api-access-8qjml\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.311898 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerStarted","Data":"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.313821 4763 generic.go:334] "Generic (PLEG): container finished" podID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerID="12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910" exitCode=0 Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.313924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerDied","Data":"12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.318431 4763 generic.go:334] "Generic (PLEG): container finished" podID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerID="ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2" exitCode=0 Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.318511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerDied","Data":"ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.323205 4763 generic.go:334] "Generic (PLEG): container finished" podID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerID="937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237" exitCode=0 Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.324016 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hnlh" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.324461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerDied","Data":"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.324495 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hnlh" event={"ID":"1150f7d8-8918-497a-9166-869e2fd2eb04","Type":"ContainerDied","Data":"cbab0e5f3e1e5806a79770b6913bfa2b48d2c9e3c28130f27890923783110c08"} Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.324514 4763 scope.go:117] "RemoveContainer" containerID="937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.337779 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l5zxt" podStartSLOduration=2.46525117 podStartE2EDuration="44.337757703s" podCreationTimestamp="2025-10-06 14:55:55 +0000 UTC" firstStartedPulling="2025-10-06 14:55:56.84413679 +0000 UTC m=+153.999429322" lastFinishedPulling="2025-10-06 14:56:38.716643343 +0000 UTC m=+195.871935855" observedRunningTime="2025-10-06 14:56:39.333604202 +0000 UTC m=+196.488896714" watchObservedRunningTime="2025-10-06 14:56:39.337757703 +0000 UTC m=+196.493050215" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.345170 4763 scope.go:117] "RemoveContainer" containerID="72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.363899 4763 scope.go:117] "RemoveContainer" containerID="234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.385237 4763 scope.go:117] "RemoveContainer" containerID="937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237" Oct 06 14:56:39 crc kubenswrapper[4763]: E1006 14:56:39.389201 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237\": container with ID starting with 937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237 not found: ID does not exist" containerID="937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.389245 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237"} err="failed to get container status \"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237\": rpc error: code = NotFound desc = could not find container \"937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237\": container with ID starting with 937aec2f8573a4179f5e01e839b907b7908b33656b890025b17f8db021f9a237 not found: ID does not exist" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.389295 4763 scope.go:117] "RemoveContainer" containerID="72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf" Oct 06 14:56:39 crc kubenswrapper[4763]: E1006 14:56:39.389802 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf\": container with ID starting with 72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf not found: ID does not exist" containerID="72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.389834 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf"} err="failed to get container status \"72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf\": rpc error: code = NotFound desc = could not find container \"72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf\": container with ID starting with 72f28962e7e3cda369c6b00276caddf0789e0e04cef897277f1a7704104e27bf not found: ID does not exist" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.389852 4763 scope.go:117] "RemoveContainer" containerID="234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2" Oct 06 14:56:39 crc kubenswrapper[4763]: E1006 14:56:39.390141 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2\": container with ID starting with 234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2 not found: ID does not exist" containerID="234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.390171 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2"} err="failed to get container status \"234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2\": rpc error: code = NotFound desc = could not find container \"234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2\": container with ID starting with 234eebca095a971628ae79be668da52fed3f51b99eb47487d9b82a32caa390e2 not found: ID does not exist" Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.391048 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"] Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.395566 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hnlh"] Oct 06 14:56:39 crc kubenswrapper[4763]: I1006 14:56:39.584443 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" path="/var/lib/kubelet/pods/1150f7d8-8918-497a-9166-869e2fd2eb04/volumes" Oct 06 14:56:40 crc kubenswrapper[4763]: I1006 14:56:40.329936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerStarted","Data":"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879"} Oct 06 14:56:40 crc kubenswrapper[4763]: I1006 14:56:40.332581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerStarted","Data":"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35"} Oct 06 14:56:40 crc kubenswrapper[4763]: I1006 14:56:40.335410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" 
event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerStarted","Data":"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"} Oct 06 14:56:40 crc kubenswrapper[4763]: I1006 14:56:40.350770 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88kng" podStartSLOduration=2.522481846 podStartE2EDuration="43.350757728s" podCreationTimestamp="2025-10-06 14:55:57 +0000 UTC" firstStartedPulling="2025-10-06 14:55:58.902155425 +0000 UTC m=+156.057447937" lastFinishedPulling="2025-10-06 14:56:39.730431307 +0000 UTC m=+196.885723819" observedRunningTime="2025-10-06 14:56:40.350119391 +0000 UTC m=+197.505411903" watchObservedRunningTime="2025-10-06 14:56:40.350757728 +0000 UTC m=+197.506050240" Oct 06 14:56:40 crc kubenswrapper[4763]: I1006 14:56:40.386158 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-thfpq" podStartSLOduration=2.426309201 podStartE2EDuration="45.386136597s" podCreationTimestamp="2025-10-06 14:55:55 +0000 UTC" firstStartedPulling="2025-10-06 14:55:56.830782352 +0000 UTC m=+153.986074864" lastFinishedPulling="2025-10-06 14:56:39.790609748 +0000 UTC m=+196.945902260" observedRunningTime="2025-10-06 14:56:40.370526473 +0000 UTC m=+197.525819005" watchObservedRunningTime="2025-10-06 14:56:40.386136597 +0000 UTC m=+197.541429109" Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.341949 4763 generic.go:334] "Generic (PLEG): container finished" podID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerID="0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c" exitCode=0 Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.342126 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerDied","Data":"0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c"} Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.344184 4763 generic.go:334] "Generic (PLEG): container finished" podID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerID="8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8" exitCode=0 Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.344208 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerDied","Data":"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"} Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.344229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerStarted","Data":"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"} Oct 06 14:56:41 crc kubenswrapper[4763]: I1006 14:56:41.375191 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zd9hq" podStartSLOduration=2.473058426 podStartE2EDuration="44.375170259s" podCreationTimestamp="2025-10-06 14:55:57 +0000 UTC" firstStartedPulling="2025-10-06 14:55:58.89889594 +0000 UTC m=+156.054188452" lastFinishedPulling="2025-10-06 14:56:40.801007773 +0000 UTC m=+197.956300285" observedRunningTime="2025-10-06 14:56:41.371528593 +0000 UTC m=+198.526821115" watchObservedRunningTime="2025-10-06 14:56:41.375170259 +0000 UTC m=+198.530462771" Oct 06 14:56:42 crc 
kubenswrapper[4763]: I1006 14:56:42.351186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerStarted","Data":"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5"} Oct 06 14:56:42 crc kubenswrapper[4763]: I1006 14:56:42.371677 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nczcd" podStartSLOduration=3.339380864 podStartE2EDuration="44.371653829s" podCreationTimestamp="2025-10-06 14:55:58 +0000 UTC" firstStartedPulling="2025-10-06 14:56:00.962950719 +0000 UTC m=+158.118243231" lastFinishedPulling="2025-10-06 14:56:41.995223664 +0000 UTC m=+199.150516196" observedRunningTime="2025-10-06 14:56:42.368867815 +0000 UTC m=+199.524160327" watchObservedRunningTime="2025-10-06 14:56:42.371653829 +0000 UTC m=+199.526946331" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.104806 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.105418 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vlxmh" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="registry-server" containerID="cri-o://c490a56ba58b81513416ef524501d9d3a94b600b2d91fd3291b895858c8f2774" gracePeriod=2 Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.358837 4763 generic.go:334] "Generic (PLEG): container finished" podID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerID="66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec" exitCode=0 Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.358896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerDied","Data":"66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec"} Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.364369 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6759086-6fd8-4989-8418-2d443118ea98" containerID="c490a56ba58b81513416ef524501d9d3a94b600b2d91fd3291b895858c8f2774" exitCode=0 Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.364397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerDied","Data":"c490a56ba58b81513416ef524501d9d3a94b600b2d91fd3291b895858c8f2774"} Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.456544 4763 util.go:48] "No ready sandbox for pod can be found. 
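
[Note: the two durations in each "Observed pod startup duration" entry are related through the image-pull window: podStartSLOduration is the end-to-end startup time minus the time spent pulling images, so the tracker charges only the kubelet's own work against the startup SLO, not registry latency. Checking with redhat-operators-nczcd above: pull time = 199.150516196s − 158.118243231s = 41.032272965s, and 44.371653829s − 41.032272965s = 3.339380864s, which is exactly the reported podStartSLOduration. The same arithmetic holds for every tracker entry in this section, which is why pods stuck in ImagePullBackOff still show small SLO durations despite multi-minute E2E times.]
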
Need to start a new one" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.555564 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities\") pod \"f6759086-6fd8-4989-8418-2d443118ea98\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.555657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwgk\" (UniqueName: \"kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk\") pod \"f6759086-6fd8-4989-8418-2d443118ea98\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.555757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content\") pod \"f6759086-6fd8-4989-8418-2d443118ea98\" (UID: \"f6759086-6fd8-4989-8418-2d443118ea98\") " Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.556585 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities" (OuterVolumeSpecName: "utilities") pod "f6759086-6fd8-4989-8418-2d443118ea98" (UID: "f6759086-6fd8-4989-8418-2d443118ea98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.562145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk" (OuterVolumeSpecName: "kube-api-access-6xwgk") pod "f6759086-6fd8-4989-8418-2d443118ea98" (UID: "f6759086-6fd8-4989-8418-2d443118ea98"). InnerVolumeSpecName "kube-api-access-6xwgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.651662 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6759086-6fd8-4989-8418-2d443118ea98" (UID: "f6759086-6fd8-4989-8418-2d443118ea98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.656600 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.656641 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6759086-6fd8-4989-8418-2d443118ea98-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:43 crc kubenswrapper[4763]: I1006 14:56:43.656655 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwgk\" (UniqueName: \"kubernetes.io/projected/f6759086-6fd8-4989-8418-2d443118ea98-kube-api-access-6xwgk\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.372283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlxmh" event={"ID":"f6759086-6fd8-4989-8418-2d443118ea98","Type":"ContainerDied","Data":"4a4b858487d9dd0aae15b68adde6668975df176fbf91a24ff0c0cac4e39b9a00"} Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.372555 4763 scope.go:117] "RemoveContainer" containerID="c490a56ba58b81513416ef524501d9d3a94b600b2d91fd3291b895858c8f2774" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.372356 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlxmh" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.377102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerStarted","Data":"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0"} Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.388408 4763 scope.go:117] "RemoveContainer" containerID="0cccfaece243deabcf291227085b8b14f726b6cbdb5be4cd1400705dc06d5798" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.403116 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7cf9s" podStartSLOduration=2.4695256 podStartE2EDuration="49.40310039s" podCreationTimestamp="2025-10-06 14:55:55 +0000 UTC" firstStartedPulling="2025-10-06 14:55:56.853245955 +0000 UTC m=+154.008538467" lastFinishedPulling="2025-10-06 14:56:43.786820755 +0000 UTC m=+200.942113257" observedRunningTime="2025-10-06 14:56:44.398123047 +0000 UTC m=+201.553415559" watchObservedRunningTime="2025-10-06 14:56:44.40310039 +0000 UTC m=+201.558392902" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.415371 4763 scope.go:117] "RemoveContainer" containerID="c27700bc4c9a51ef9aaa0614e76ffb0968a3093d1fdaca3e79da02286d53752e" Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.416140 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:56:44 crc kubenswrapper[4763]: I1006 14:56:44.419271 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vlxmh"] Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.583312 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6759086-6fd8-4989-8418-2d443118ea98" path="/var/lib/kubelet/pods/f6759086-6fd8-4989-8418-2d443118ea98/volumes" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.591254 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.591292 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.643571 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.776861 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.779776 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:56:45 crc kubenswrapper[4763]: I1006 14:56:45.826148 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:56:46 crc kubenswrapper[4763]: I1006 14:56:46.034959 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:46 crc kubenswrapper[4763]: I1006 14:56:46.035011 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:46 crc kubenswrapper[4763]: I1006 14:56:46.083084 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:46 crc kubenswrapper[4763]: I1006 14:56:46.427497 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:56:46 crc kubenswrapper[4763]: I1006 14:56:46.432032 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:47 crc kubenswrapper[4763]: I1006 14:56:47.793747 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:56:47 crc kubenswrapper[4763]: I1006 14:56:47.794096 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:56:47 crc kubenswrapper[4763]: I1006 14:56:47.855859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.264696 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.264977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.304508 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.443570 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.452101 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:48 crc kubenswrapper[4763]: 
I1006 14:56:48.836781 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.836905 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.879240 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.899466 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thfpq"] Oct 06 14:56:48 crc kubenswrapper[4763]: I1006 14:56:48.900392 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-thfpq" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="registry-server" containerID="cri-o://19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35" gracePeriod=2 Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.212254 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.221289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content\") pod \"d77c108d-3366-4d93-a5a7-c59db5afaca0\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.221335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities\") pod \"d77c108d-3366-4d93-a5a7-c59db5afaca0\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.221375 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mps8g\" (UniqueName: \"kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g\") pod \"d77c108d-3366-4d93-a5a7-c59db5afaca0\" (UID: \"d77c108d-3366-4d93-a5a7-c59db5afaca0\") " Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.222696 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities" (OuterVolumeSpecName: "utilities") pod "d77c108d-3366-4d93-a5a7-c59db5afaca0" (UID: "d77c108d-3366-4d93-a5a7-c59db5afaca0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.239551 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g" (OuterVolumeSpecName: "kube-api-access-mps8g") pod "d77c108d-3366-4d93-a5a7-c59db5afaca0" (UID: "d77c108d-3366-4d93-a5a7-c59db5afaca0"). InnerVolumeSpecName "kube-api-access-mps8g". 
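
[Note: "Killing container with a grace period" shows the termination contract on a pod DELETE: the kubelet asks the runtime to stop the container, the runtime delivers SIGTERM, and only if the container outlives the grace period does it escalate to SIGKILL. These catalog pods use gracePeriod=2, and the registry-server exits promptly on SIGTERM (the ContainerDied events carry exitCode=0). A minimal sketch of the SIGTERM-then-SIGKILL pattern as a runtime might implement it; stopWithGrace is illustrative, not CRI-O's actual code:

    package main

    import (
        "os"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace sends SIGTERM, waits up to grace, then SIGKILLs.
    func stopWithGrace(proc *os.Process, grace time.Duration) {
        proc.Signal(syscall.SIGTERM)
        done := make(chan struct{})
        go func() {
            proc.Wait()
            close(done)
        }()
        select {
        case <-done: // exited within the grace period, as in the log (exitCode=0)
        case <-time.After(grace):
            proc.Kill() // escalate to SIGKILL
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            return
        }
        stopWithGrace(cmd.Process, 2*time.Second)
    }

]
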
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.277449 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d77c108d-3366-4d93-a5a7-c59db5afaca0" (UID: "d77c108d-3366-4d93-a5a7-c59db5afaca0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.322797 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.322836 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mps8g\" (UniqueName: \"kubernetes.io/projected/d77c108d-3366-4d93-a5a7-c59db5afaca0-kube-api-access-mps8g\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.322850 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77c108d-3366-4d93-a5a7-c59db5afaca0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.408515 4763 generic.go:334] "Generic (PLEG): container finished" podID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerID="19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35" exitCode=0 Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.408604 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thfpq" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.408683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerDied","Data":"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35"} Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.408736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thfpq" event={"ID":"d77c108d-3366-4d93-a5a7-c59db5afaca0","Type":"ContainerDied","Data":"17e534aceb03545c0998cf333dde7e9b55c1391218fee93287e1bb69e393a909"} Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.408771 4763 scope.go:117] "RemoveContainer" containerID="19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.429072 4763 scope.go:117] "RemoveContainer" containerID="ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.450993 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.451252 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thfpq"] Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.453842 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-thfpq"] Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.461594 4763 scope.go:117] "RemoveContainer" containerID="fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.475584 4763 
scope.go:117] "RemoveContainer" containerID="19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35" Oct 06 14:56:49 crc kubenswrapper[4763]: E1006 14:56:49.476066 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35\": container with ID starting with 19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35 not found: ID does not exist" containerID="19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.476100 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35"} err="failed to get container status \"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35\": rpc error: code = NotFound desc = could not find container \"19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35\": container with ID starting with 19f489b044b4b75572925f0a29677b443832ac915f703bfcb5c2aaffd9e95a35 not found: ID does not exist" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.476127 4763 scope.go:117] "RemoveContainer" containerID="ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2" Oct 06 14:56:49 crc kubenswrapper[4763]: E1006 14:56:49.476503 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2\": container with ID starting with ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2 not found: ID does not exist" containerID="ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.477428 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2"} err="failed to get container status \"ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2\": rpc error: code = NotFound desc = could not find container \"ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2\": container with ID starting with ec9a7f0b75420cebd491e18d71b85a694a7d6e0bff919d450c1f2d290aac88a2 not found: ID does not exist" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.477460 4763 scope.go:117] "RemoveContainer" containerID="fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712" Oct 06 14:56:49 crc kubenswrapper[4763]: E1006 14:56:49.477749 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712\": container with ID starting with fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712 not found: ID does not exist" containerID="fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.477777 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712"} err="failed to get container status \"fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712\": rpc error: code = NotFound desc = could not find container \"fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712\": container with ID starting with 
fa072633821e96dab89e6b77fb527e0919a629bd115206143dc76b0a3fbab712 not found: ID does not exist" Oct 06 14:56:49 crc kubenswrapper[4763]: I1006 14:56:49.580952 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" path="/var/lib/kubelet/pods/d77c108d-3366-4d93-a5a7-c59db5afaca0/volumes" Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.300516 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"] Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.422603 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88kng" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="registry-server" containerID="cri-o://10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879" gracePeriod=2 Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.772300 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.949725 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2bsn\" (UniqueName: \"kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn\") pod \"9740294c-47bf-442d-b72c-e58ee8877f3b\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.949849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities\") pod \"9740294c-47bf-442d-b72c-e58ee8877f3b\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.949924 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content\") pod \"9740294c-47bf-442d-b72c-e58ee8877f3b\" (UID: \"9740294c-47bf-442d-b72c-e58ee8877f3b\") " Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.950555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities" (OuterVolumeSpecName: "utilities") pod "9740294c-47bf-442d-b72c-e58ee8877f3b" (UID: "9740294c-47bf-442d-b72c-e58ee8877f3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.955061 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn" (OuterVolumeSpecName: "kube-api-access-m2bsn") pod "9740294c-47bf-442d-b72c-e58ee8877f3b" (UID: "9740294c-47bf-442d-b72c-e58ee8877f3b"). InnerVolumeSpecName "kube-api-access-m2bsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:56:51 crc kubenswrapper[4763]: I1006 14:56:51.964732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9740294c-47bf-442d-b72c-e58ee8877f3b" (UID: "9740294c-47bf-442d-b72c-e58ee8877f3b"). InnerVolumeSpecName "catalog-content". 
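
[Note: the repeated "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs in these entries (for community-operators-thfpq above and redhat-marketplace-88kng below) are a benign race, not data loss: the container was already removed, and the kubelet's cleanup path re-queries an ID that no longer exists, logs the NotFound, and moves on. A client written against the CRI API would typically treat NotFound on removal as success; a hedged sketch, with removeIfPresent as an illustrative helper:

    package criutil

    import (
        "context"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
        cri "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    // removeIfPresent deletes a container, treating NotFound as success;
    // the "ID does not exist" errors in the log are exactly this race.
    func removeIfPresent(ctx context.Context, rt cri.RuntimeServiceClient, id string) error {
        _, err := rt.RemoveContainer(ctx, &cri.RemoveContainerRequest{ContainerId: id})
        if status.Code(err) == codes.NotFound {
            return nil // already gone; nothing to do
        }
        return err
    }

]
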
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.051595 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.051683 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9740294c-47bf-442d-b72c-e58ee8877f3b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.051695 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2bsn\" (UniqueName: \"kubernetes.io/projected/9740294c-47bf-442d-b72c-e58ee8877f3b-kube-api-access-m2bsn\") on node \"crc\" DevicePath \"\"" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.431017 4763 generic.go:334] "Generic (PLEG): container finished" podID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerID="10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879" exitCode=0 Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.431077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerDied","Data":"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879"} Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.431096 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88kng" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.431116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88kng" event={"ID":"9740294c-47bf-442d-b72c-e58ee8877f3b","Type":"ContainerDied","Data":"126fd65f897b2ce35051b58f55536eb98afe6da412f4dcf64b28770e3f5da010"} Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.431144 4763 scope.go:117] "RemoveContainer" containerID="10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.446779 4763 scope.go:117] "RemoveContainer" containerID="12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.462971 4763 scope.go:117] "RemoveContainer" containerID="a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.465409 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"] Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.469106 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88kng"] Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.488449 4763 scope.go:117] "RemoveContainer" containerID="10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879" Oct 06 14:56:52 crc kubenswrapper[4763]: E1006 14:56:52.489005 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879\": container with ID starting with 10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879 not found: ID does not exist" containerID="10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.489049 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879"} err="failed to get container status \"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879\": rpc error: code = NotFound desc = could not find container \"10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879\": container with ID starting with 10ebfefbf9e6f4817915e25648ed60d8ac92540a4a1a64f90177ca546beb9879 not found: ID does not exist" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.489078 4763 scope.go:117] "RemoveContainer" containerID="12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910" Oct 06 14:56:52 crc kubenswrapper[4763]: E1006 14:56:52.489488 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910\": container with ID starting with 12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910 not found: ID does not exist" containerID="12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.489524 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910"} err="failed to get container status \"12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910\": rpc error: code = NotFound desc = could not find container \"12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910\": container with ID starting with 12e67d48cd5f049ec9e14328769c79b0a1b15927ba09cbeea00b18078d6fd910 not found: ID does not exist" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.489547 4763 scope.go:117] "RemoveContainer" containerID="a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c" Oct 06 14:56:52 crc kubenswrapper[4763]: E1006 14:56:52.489886 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c\": container with ID starting with a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c not found: ID does not exist" containerID="a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c" Oct 06 14:56:52 crc kubenswrapper[4763]: I1006 14:56:52.489948 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c"} err="failed to get container status \"a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c\": rpc error: code = NotFound desc = could not find container \"a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c\": container with ID starting with a6430273eabf69132a5fddab9d771a2fb9c485dcfea296368a4ae7c1da93672c not found: ID does not exist" Oct 06 14:56:53 crc kubenswrapper[4763]: I1006 14:56:53.585920 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" path="/var/lib/kubelet/pods/9740294c-47bf-442d-b72c-e58ee8877f3b/volumes" Oct 06 14:56:55 crc kubenswrapper[4763]: I1006 14:56:55.833937 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:56:57 crc kubenswrapper[4763]: I1006 14:56:57.743446 4763 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"] Oct 06 14:57:03 crc kubenswrapper[4763]: I1006 14:57:03.877198 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:57:03 crc kubenswrapper[4763]: I1006 14:57:03.877603 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:57:03 crc kubenswrapper[4763]: I1006 14:57:03.877720 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 14:57:03 crc kubenswrapper[4763]: I1006 14:57:03.878750 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:57:03 crc kubenswrapper[4763]: I1006 14:57:03.878898 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f" gracePeriod=600 Oct 06 14:57:04 crc kubenswrapper[4763]: I1006 14:57:04.501574 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f" exitCode=0 Oct 06 14:57:04 crc kubenswrapper[4763]: I1006 14:57:04.501833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f"} Oct 06 14:57:04 crc kubenswrapper[4763]: I1006 14:57:04.501952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5"} Oct 06 14:57:22 crc kubenswrapper[4763]: I1006 14:57:22.785582 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerName="oauth-openshift" containerID="cri-o://2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d" gracePeriod=15 Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.223982 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.274289 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-56d874b86c-dsqd7"] Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.275314 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26213b6-9e33-4bd3-9a9c-cd07b049cf17" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.275560 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26213b6-9e33-4bd3-9a9c-cd07b049cf17" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.275823 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.276031 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.276213 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.276366 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.276531 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.276729 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.277002 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.277181 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.280742 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.280796 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.280848 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.280865 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.280886 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.280901 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.280918 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.280933 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="extract-utilities" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.280959 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.280977 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.281003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281017 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.281035 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281049 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.281072 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerName="oauth-openshift" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281087 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerName="oauth-openshift" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.281111 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281127 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="extract-content" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.281153 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281168 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281442 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9740294c-47bf-442d-b72c-e58ee8877f3b" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281471 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6759086-6fd8-4989-8418-2d443118ea98" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281494 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd53d5e-ae7a-49ac-a28f-ae041c628a4b" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281517 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerName="oauth-openshift" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281547 4763 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1150f7d8-8918-497a-9166-869e2fd2eb04" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281589 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26213b6-9e33-4bd3-9a9c-cd07b049cf17" containerName="pruner" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.281645 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77c108d-3366-4d93-a5a7-c59db5afaca0" containerName="registry-server" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.282506 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.287322 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56d874b86c-dsqd7"] Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.376498 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.377179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.377383 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.377735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379006 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379265 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.378390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: 
"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379586 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379373 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379697 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379751 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhqvz\" (UniqueName: \"kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379901 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379948 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies\") pod \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.379987 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca\") pod 
\"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\" (UID: \"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d\") " Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380164 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2g4\" (UniqueName: \"kubernetes.io/projected/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-kube-api-access-gs2g4\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380208 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-login\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-dir\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380393 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.380930 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-policies\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-error\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381111 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381154 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381268 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-session\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381480 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381516 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381547 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381579 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.381775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.385504 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.386143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.386254 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.385883 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz" (OuterVolumeSpecName: "kube-api-access-fhqvz") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "kube-api-access-fhqvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.387351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.389224 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.389612 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.389884 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.390351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" (UID: "8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483165 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-error\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483227 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-session\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483380 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2g4\" (UniqueName: \"kubernetes.io/projected/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-kube-api-access-gs2g4\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483413 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-login\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483471 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " 
pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-dir\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-policies\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483791 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhqvz\" (UniqueName: \"kubernetes.io/projected/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-kube-api-access-fhqvz\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483814 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483834 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-cliconfig\") 
on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483852 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483871 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483890 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483909 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483927 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483947 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.483966 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.484932 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-policies\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.485529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-audit-dir\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.485950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.486662 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.488483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.488753 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-error\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.489242 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.489695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.490568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.491185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-login\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.491222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.493263 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.495171 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-v4-0-config-system-session\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.511137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2g4\" (UniqueName: \"kubernetes.io/projected/b48f3b1e-a51e-44b9-b8a4-c5112db8568e-kube-api-access-gs2g4\") pod \"oauth-openshift-56d874b86c-dsqd7\" (UID: \"b48f3b1e-a51e-44b9-b8a4-c5112db8568e\") " pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.605646 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.615781 4763 generic.go:334] "Generic (PLEG): container finished" podID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" containerID="2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d" exitCode=0 Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.615879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" event={"ID":"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d","Type":"ContainerDied","Data":"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d"} Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.615925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" event={"ID":"8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d","Type":"ContainerDied","Data":"fe9f6a743b6c34135d2fed33fbf4adf524527f191522f5bc0a303f22887196ab"} Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.615958 4763 scope.go:117] "RemoveContainer" containerID="2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.616136 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9nc4c" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.642662 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"] Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.650288 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9nc4c"] Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.671813 4763 scope.go:117] "RemoveContainer" containerID="2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d" Oct 06 14:57:23 crc kubenswrapper[4763]: E1006 14:57:23.672461 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d\": container with ID starting with 2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d not found: ID does not exist" containerID="2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.672550 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d"} err="failed to get container status \"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d\": rpc error: code = NotFound desc = could not find container \"2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d\": container with ID starting with 2a24a026c031ccd4f834f689752fe406ff95f099ab5eaa7fcf4e69fca5f0dd1d not found: ID does not exist" Oct 06 14:57:23 crc kubenswrapper[4763]: I1006 14:57:23.913216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56d874b86c-dsqd7"] Oct 06 14:57:23 crc kubenswrapper[4763]: W1006 14:57:23.926133 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb48f3b1e_a51e_44b9_b8a4_c5112db8568e.slice/crio-16efdc75490d5f9a379a8c502be646c6ec46b093231ba7a6b2f2f0bacd15db47 WatchSource:0}: Error finding container 16efdc75490d5f9a379a8c502be646c6ec46b093231ba7a6b2f2f0bacd15db47: Status 404 returned error can't find the container with id 16efdc75490d5f9a379a8c502be646c6ec46b093231ba7a6b2f2f0bacd15db47 Oct 06 14:57:24 crc kubenswrapper[4763]: I1006 14:57:24.632106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" event={"ID":"b48f3b1e-a51e-44b9-b8a4-c5112db8568e","Type":"ContainerStarted","Data":"ea2ef9f53f7404201b0b42649f032adfc3740c112918737005c0c732c8d215a6"} Oct 06 14:57:24 crc kubenswrapper[4763]: I1006 14:57:24.632438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" event={"ID":"b48f3b1e-a51e-44b9-b8a4-c5112db8568e","Type":"ContainerStarted","Data":"16efdc75490d5f9a379a8c502be646c6ec46b093231ba7a6b2f2f0bacd15db47"} Oct 06 14:57:24 crc kubenswrapper[4763]: I1006 14:57:24.632462 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:24 crc kubenswrapper[4763]: I1006 14:57:24.658744 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" podStartSLOduration=27.65872491 podStartE2EDuration="27.65872491s" 
podCreationTimestamp="2025-10-06 14:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:57:24.654708023 +0000 UTC m=+241.810000545" watchObservedRunningTime="2025-10-06 14:57:24.65872491 +0000 UTC m=+241.814017462" Oct 06 14:57:24 crc kubenswrapper[4763]: I1006 14:57:24.983654 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-56d874b86c-dsqd7" Oct 06 14:57:25 crc kubenswrapper[4763]: I1006 14:57:25.582819 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d" path="/var/lib/kubelet/pods/8f01c1e6-2fd7-4ca0-8d62-2b79c82e402d/volumes" Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.919124 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.919901 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7cf9s" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="registry-server" containerID="cri-o://d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0" gracePeriod=30 Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.933515 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5zxt"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.933952 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l5zxt" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="registry-server" containerID="cri-o://7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386" gracePeriod=30 Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.947841 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.948418 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerName="marketplace-operator" containerID="cri-o://c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be" gracePeriod=30 Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.974826 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.975063 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zd9hq" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="registry-server" containerID="cri-o://3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609" gracePeriod=30 Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.986037 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwnx6"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.988561 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.994038 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"] Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.994305 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nczcd" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="registry-server" containerID="cri-o://eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5" gracePeriod=30 Oct 06 14:57:39 crc kubenswrapper[4763]: I1006 14:57:39.997457 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwnx6"] Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.128984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.129344 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grkb\" (UniqueName: \"kubernetes.io/projected/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-kube-api-access-6grkb\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.129380 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.230468 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.230540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grkb\" (UniqueName: \"kubernetes.io/projected/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-kube-api-access-6grkb\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.230564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.232907 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.235979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.246727 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grkb\" (UniqueName: \"kubernetes.io/projected/dfcf8a62-1f4d-4af8-8bec-5eabdba0df59-kube-api-access-6grkb\") pod \"marketplace-operator-79b997595-dwnx6\" (UID: \"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.362020 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.377887 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.381489 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.386085 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.404322 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.418004 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545716 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities\") pod \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545771 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zh9w\" (UniqueName: \"kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w\") pod \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545792 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities\") pod \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545808 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8g8h\" (UniqueName: \"kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h\") pod \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545837 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wfzl\" (UniqueName: \"kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl\") pod \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545865 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content\") pod \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\" (UID: \"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545883 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xgn\" (UniqueName: \"kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn\") pod \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545904 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content\") pod \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics\") pod \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities\") pod \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\" (UID: \"c6c90a53-6c57-4fd5-b9d7-77243c584cc4\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545964 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content\") pod \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\" (UID: \"d61abe4d-a32e-4520-a8a5-bb9f111a7c28\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.545978 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content\") pod \"26eaff77-0592-4caa-8fa9-f2ad86445c41\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.546004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca\") pod \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\" (UID: \"ef7f0e05-29c2-4890-8d13-0466593e1fa8\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.546027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities\") pod \"26eaff77-0592-4caa-8fa9-f2ad86445c41\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.546053 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smqhw\" (UniqueName: \"kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw\") pod \"26eaff77-0592-4caa-8fa9-f2ad86445c41\" (UID: \"26eaff77-0592-4caa-8fa9-f2ad86445c41\") " Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.546654 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities" (OuterVolumeSpecName: "utilities") pod "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" (UID: "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.546715 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities" (OuterVolumeSpecName: "utilities") pod "d61abe4d-a32e-4520-a8a5-bb9f111a7c28" (UID: "d61abe4d-a32e-4520-a8a5-bb9f111a7c28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.547208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ef7f0e05-29c2-4890-8d13-0466593e1fa8" (UID: "ef7f0e05-29c2-4890-8d13-0466593e1fa8"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.547439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities" (OuterVolumeSpecName: "utilities") pod "c6c90a53-6c57-4fd5-b9d7-77243c584cc4" (UID: "c6c90a53-6c57-4fd5-b9d7-77243c584cc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.547915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities" (OuterVolumeSpecName: "utilities") pod "26eaff77-0592-4caa-8fa9-f2ad86445c41" (UID: "26eaff77-0592-4caa-8fa9-f2ad86445c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.549456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w" (OuterVolumeSpecName: "kube-api-access-6zh9w") pod "d61abe4d-a32e-4520-a8a5-bb9f111a7c28" (UID: "d61abe4d-a32e-4520-a8a5-bb9f111a7c28"). InnerVolumeSpecName "kube-api-access-6zh9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.549485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h" (OuterVolumeSpecName: "kube-api-access-c8g8h") pod "c6c90a53-6c57-4fd5-b9d7-77243c584cc4" (UID: "c6c90a53-6c57-4fd5-b9d7-77243c584cc4"). InnerVolumeSpecName "kube-api-access-c8g8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.549585 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw" (OuterVolumeSpecName: "kube-api-access-smqhw") pod "26eaff77-0592-4caa-8fa9-f2ad86445c41" (UID: "26eaff77-0592-4caa-8fa9-f2ad86445c41"). InnerVolumeSpecName "kube-api-access-smqhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.551753 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl" (OuterVolumeSpecName: "kube-api-access-2wfzl") pod "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" (UID: "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca"). InnerVolumeSpecName "kube-api-access-2wfzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.553897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn" (OuterVolumeSpecName: "kube-api-access-75xgn") pod "ef7f0e05-29c2-4890-8d13-0466593e1fa8" (UID: "ef7f0e05-29c2-4890-8d13-0466593e1fa8"). InnerVolumeSpecName "kube-api-access-75xgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.557840 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ef7f0e05-29c2-4890-8d13-0466593e1fa8" (UID: "ef7f0e05-29c2-4890-8d13-0466593e1fa8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.567387 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwnx6"] Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.575748 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6c90a53-6c57-4fd5-b9d7-77243c584cc4" (UID: "c6c90a53-6c57-4fd5-b9d7-77243c584cc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.601457 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" (UID: "e364490a-9e0b-45fb-bbf3-f6c603f7e9ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.606109 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d61abe4d-a32e-4520-a8a5-bb9f111a7c28" (UID: "d61abe4d-a32e-4520-a8a5-bb9f111a7c28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.651166 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26eaff77-0592-4caa-8fa9-f2ad86445c41" (UID: "26eaff77-0592-4caa-8fa9-f2ad86445c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652206 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652262 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xgn\" (UniqueName: \"kubernetes.io/projected/ef7f0e05-29c2-4890-8d13-0466593e1fa8-kube-api-access-75xgn\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652278 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652288 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652298 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652306 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652314 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652323 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef7f0e05-29c2-4890-8d13-0466593e1fa8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652333 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26eaff77-0592-4caa-8fa9-f2ad86445c41-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652342 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smqhw\" (UniqueName: \"kubernetes.io/projected/26eaff77-0592-4caa-8fa9-f2ad86445c41-kube-api-access-smqhw\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652350 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652360 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zh9w\" (UniqueName: \"kubernetes.io/projected/d61abe4d-a32e-4520-a8a5-bb9f111a7c28-kube-api-access-6zh9w\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652367 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652375 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8g8h\" (UniqueName: \"kubernetes.io/projected/c6c90a53-6c57-4fd5-b9d7-77243c584cc4-kube-api-access-c8g8h\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.652383 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wfzl\" (UniqueName: \"kubernetes.io/projected/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca-kube-api-access-2wfzl\") on node \"crc\" DevicePath \"\"" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.754160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" event={"ID":"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59","Type":"ContainerStarted","Data":"a4142f501971dadcd9d7b0bb6c271378e5bdc09af3e75118088118cb01c39695"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.763709 4763 generic.go:334] "Generic (PLEG): container finished" podID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerID="7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386" exitCode=0 Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.763801 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerDied","Data":"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.763853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5zxt" event={"ID":"e364490a-9e0b-45fb-bbf3-f6c603f7e9ca","Type":"ContainerDied","Data":"d56ef74d6b509967d9caa0310542280aa1c7ef3e142e0d6567e701db9c167a66"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.763874 4763 scope.go:117] "RemoveContainer" containerID="7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.763814 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5zxt" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.767301 4763 generic.go:334] "Generic (PLEG): container finished" podID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerID="3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609" exitCode=0 Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.767338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerDied","Data":"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.767352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zd9hq" event={"ID":"c6c90a53-6c57-4fd5-b9d7-77243c584cc4","Type":"ContainerDied","Data":"3ff71cd77a97b853ea94808baaceb713ce930459390bcca16b8eec1b8cf60db6"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.767398 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zd9hq" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.770783 4763 generic.go:334] "Generic (PLEG): container finished" podID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerID="d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0" exitCode=0 Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.770842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerDied","Data":"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.770907 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cf9s" event={"ID":"d61abe4d-a32e-4520-a8a5-bb9f111a7c28","Type":"ContainerDied","Data":"2137dd38c0bdf410dceee4bc5eb93b2174dd2d98fa999a4e1be0ec02a13cf137"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.770868 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7cf9s" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.772659 4763 generic.go:334] "Generic (PLEG): container finished" podID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerID="eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5" exitCode=0 Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.772741 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerDied","Data":"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.772810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nczcd" event={"ID":"26eaff77-0592-4caa-8fa9-f2ad86445c41","Type":"ContainerDied","Data":"0c1f608501dabda7f8f9a7bed575e37c2d1ad85bb96f96b1563a8e6e5af6e735"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.772917 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nczcd" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.776187 4763 generic.go:334] "Generic (PLEG): container finished" podID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerID="c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be" exitCode=0 Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.776228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" event={"ID":"ef7f0e05-29c2-4890-8d13-0466593e1fa8","Type":"ContainerDied","Data":"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.776253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmrz8" event={"ID":"ef7f0e05-29c2-4890-8d13-0466593e1fa8","Type":"ContainerDied","Data":"3a58529215a0e7e18c4b1c5e457a511d4c153654088068d85d9f104a702dbcb0"} Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.776396 4763 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.778155 4763 scope.go:117] "RemoveContainer" containerID="033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.798786 4763 scope.go:117] "RemoveContainer" containerID="952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.799708 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.802426 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zd9hq"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.819168 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.827504 4763 scope.go:117] "RemoveContainer" containerID="7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.828087 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386\": container with ID starting with 7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386 not found: ID does not exist" containerID="7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828122 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386"} err="failed to get container status \"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386\": rpc error: code = NotFound desc = could not find container \"7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386\": container with ID starting with 7a3ba7482ea69ec0de084f044f293e7714c319b619f5c877267dc423f4716386 not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828141 4763 scope.go:117] "RemoveContainer" containerID="033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.828422 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8\": container with ID starting with 033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8 not found: ID does not exist" containerID="033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828445 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8"} err="failed to get container status \"033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8\": rpc error: code = NotFound desc = could not find container \"033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8\": container with ID starting with 033454b76e5600b6cd953ac8cdce004c4aaa2a48561becd2cecf2abd2ea9d6d8 not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828459 4763 scope.go:117] "RemoveContainer" containerID="952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.828841 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f\": container with ID starting with 952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f not found: ID does not exist" containerID="952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828863 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f"} err="failed to get container status \"952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f\": rpc error: code = NotFound desc = could not find container \"952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f\": container with ID starting with 952c04f39dbddad8c31241ba4c7149d75c428b123bec8aaecdadd6e60638e21f not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.828882 4763 scope.go:117] "RemoveContainer" containerID="3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.830888 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmrz8"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.834205 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5zxt"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.841868 4763 scope.go:117] "RemoveContainer" containerID="8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.845073 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l5zxt"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.865648 4763 scope.go:117] "RemoveContainer" containerID="cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.866596 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.872101 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nczcd"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.878576 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.882181 4763 scope.go:117] "RemoveContainer" containerID="3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.882638 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609\": container with ID starting with 3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609 not found: ID does not exist" containerID="3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.882668 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609"} err="failed to get container status \"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609\": rpc error: code = NotFound desc = could not find container \"3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609\": container with ID starting with 3909fb5f06801ff91d9e8035499f01cff72699e2d58c5eb2b44075ce57297609 not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.882711 4763 scope.go:117] "RemoveContainer" containerID="8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.882983 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8\": container with ID starting with 8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8 not found: ID does not exist" containerID="8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.883014 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8"} err="failed to get container status \"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8\": rpc error: code = NotFound desc = could not find container \"8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8\": container with ID starting with 8454b65e24d561a07e1f320bff793ceb4a5287bd0c703aafbe15a6ca5a9938b8 not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.883034 4763 scope.go:117] "RemoveContainer" containerID="cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436"
Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.883231 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436\": container with ID starting with cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436 not found: ID does not exist" containerID="cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.883258 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436"} err="failed to get container status \"cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436\": rpc error: code = NotFound desc = could not find container \"cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436\": container with ID starting with cd7b36a973ba5eaa44d01195e8ffcf1039b9c90b1cbbbe82d6030ff08f492436 not found: ID does not exist"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.883277 4763 scope.go:117] "RemoveContainer" containerID="d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.884090 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7cf9s"]
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.898170 4763 scope.go:117] "RemoveContainer" containerID="66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec"
Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.912748 4763 scope.go:117] "RemoveContainer" containerID="a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6"
"RemoveContainer" containerID="a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.926516 4763 scope.go:117] "RemoveContainer" containerID="d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.926788 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0\": container with ID starting with d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0 not found: ID does not exist" containerID="d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.926827 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0"} err="failed to get container status \"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0\": rpc error: code = NotFound desc = could not find container \"d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0\": container with ID starting with d87c902599d7316e42bc047278b539143ac8a55e17977fea965a5a25ea3e72b0 not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.926852 4763 scope.go:117] "RemoveContainer" containerID="66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.927151 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec\": container with ID starting with 66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec not found: ID does not exist" containerID="66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.927171 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec"} err="failed to get container status \"66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec\": rpc error: code = NotFound desc = could not find container \"66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec\": container with ID starting with 66cf1843f13cc90d329cffac61ecf4263224c449b43fbb209114063fe1d8e6ec not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.927188 4763 scope.go:117] "RemoveContainer" containerID="a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.927394 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6\": container with ID starting with a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6 not found: ID does not exist" containerID="a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.927420 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6"} err="failed to get container status \"a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6\": 
rpc error: code = NotFound desc = could not find container \"a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6\": container with ID starting with a1d11cbc0819a5a710c04d01ec91585418d95f1e23108980d8d1911160e205a6 not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.927438 4763 scope.go:117] "RemoveContainer" containerID="eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.939753 4763 scope.go:117] "RemoveContainer" containerID="0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.951533 4763 scope.go:117] "RemoveContainer" containerID="92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.970001 4763 scope.go:117] "RemoveContainer" containerID="eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.970551 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5\": container with ID starting with eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5 not found: ID does not exist" containerID="eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.970586 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5"} err="failed to get container status \"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5\": rpc error: code = NotFound desc = could not find container \"eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5\": container with ID starting with eff8d766c54f8d742516748403e4e8e75eb645179baac4a74f2b8b27b1ff11c5 not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.970641 4763 scope.go:117] "RemoveContainer" containerID="0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.971123 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c\": container with ID starting with 0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c not found: ID does not exist" containerID="0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.971162 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c"} err="failed to get container status \"0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c\": rpc error: code = NotFound desc = could not find container \"0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c\": container with ID starting with 0f809b3e12fc9fcc5f96106ad3ab94cac7d53eb60d11414988ed9ba0a28ab33c not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.971194 4763 scope.go:117] "RemoveContainer" containerID="92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.971496 4763 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490\": container with ID starting with 92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490 not found: ID does not exist" containerID="92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.971517 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490"} err="failed to get container status \"92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490\": rpc error: code = NotFound desc = could not find container \"92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490\": container with ID starting with 92808ca679999de019d77d19329e385560c290d5d25a258be46b1601f71df490 not found: ID does not exist" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.971534 4763 scope.go:117] "RemoveContainer" containerID="c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.984156 4763 scope.go:117] "RemoveContainer" containerID="c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be" Oct 06 14:57:40 crc kubenswrapper[4763]: E1006 14:57:40.984760 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be\": container with ID starting with c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be not found: ID does not exist" containerID="c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be" Oct 06 14:57:40 crc kubenswrapper[4763]: I1006 14:57:40.984813 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be"} err="failed to get container status \"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be\": rpc error: code = NotFound desc = could not find container \"c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be\": container with ID starting with c396ad96ac9242b9599908a4dd9a8c028415ea961aa7117a3e75e8f6e42c13be not found: ID does not exist" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.587991 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" path="/var/lib/kubelet/pods/26eaff77-0592-4caa-8fa9-f2ad86445c41/volumes" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.589877 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" path="/var/lib/kubelet/pods/c6c90a53-6c57-4fd5-b9d7-77243c584cc4/volumes" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.590425 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" path="/var/lib/kubelet/pods/d61abe4d-a32e-4520-a8a5-bb9f111a7c28/volumes" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.590976 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" path="/var/lib/kubelet/pods/e364490a-9e0b-45fb-bbf3-f6c603f7e9ca/volumes" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.591922 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" 
path="/var/lib/kubelet/pods/ef7f0e05-29c2-4890-8d13-0466593e1fa8/volumes" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.788009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" event={"ID":"dfcf8a62-1f4d-4af8-8bec-5eabdba0df59","Type":"ContainerStarted","Data":"d7b9f16658f81d346e60a98058857c3c5d7f96b115e7a1c8693eb34a04e7ba20"} Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.788226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.791701 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" Oct 06 14:57:41 crc kubenswrapper[4763]: I1006 14:57:41.824397 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dwnx6" podStartSLOduration=2.824378017 podStartE2EDuration="2.824378017s" podCreationTimestamp="2025-10-06 14:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:57:41.803306577 +0000 UTC m=+258.958599109" watchObservedRunningTime="2025-10-06 14:57:41.824378017 +0000 UTC m=+258.979670539" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139027 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c24vp"] Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139206 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139218 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139227 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139233 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139241 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139247 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139255 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139261 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139269 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139275 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" 
containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139283 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139289 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139302 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139311 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139317 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139324 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerName="marketplace-operator" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139330 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerName="marketplace-operator" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139338 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139345 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="extract-utilities" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139355 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139364 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="extract-content" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139372 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139379 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: E1006 14:57:42.139386 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139492 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61abe4d-a32e-4520-a8a5-bb9f111a7c28" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139506 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="26eaff77-0592-4caa-8fa9-f2ad86445c41" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139517 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7f0e05-29c2-4890-8d13-0466593e1fa8" containerName="marketplace-operator" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139524 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c90a53-6c57-4fd5-b9d7-77243c584cc4" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.139532 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e364490a-9e0b-45fb-bbf3-f6c603f7e9ca" containerName="registry-server" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.140193 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c24vp" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.142798 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.145669 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c24vp"] Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.270993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-utilities\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.271041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnxn\" (UniqueName: \"kubernetes.io/projected/f6522382-5c77-4a83-9955-edec44f773dc-kube-api-access-kqnxn\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.271199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-catalog-content\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp" Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.337371 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66hlg"] Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.338860 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.342145 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.349468 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66hlg"]
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.373035 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-utilities\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.373078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnxn\" (UniqueName: \"kubernetes.io/projected/f6522382-5c77-4a83-9955-edec44f773dc-kube-api-access-kqnxn\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.373114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-catalog-content\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.373597 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-utilities\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.373799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6522382-5c77-4a83-9955-edec44f773dc-catalog-content\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.401077 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnxn\" (UniqueName: \"kubernetes.io/projected/f6522382-5c77-4a83-9955-edec44f773dc-kube-api-access-kqnxn\") pod \"certified-operators-c24vp\" (UID: \"f6522382-5c77-4a83-9955-edec44f773dc\") " pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.457270 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.474420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-utilities\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.474473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zsr4\" (UniqueName: \"kubernetes.io/projected/b659ecc7-c237-4169-8512-3f4e9ad3133b-kube-api-access-5zsr4\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.474520 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-catalog-content\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.575311 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-catalog-content\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.575748 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-utilities\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.575771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zsr4\" (UniqueName: \"kubernetes.io/projected/b659ecc7-c237-4169-8512-3f4e9ad3133b-kube-api-access-5zsr4\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.576090 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-utilities\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.576156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b659ecc7-c237-4169-8512-3f4e9ad3133b-catalog-content\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.592451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zsr4\" (UniqueName: \"kubernetes.io/projected/b659ecc7-c237-4169-8512-3f4e9ad3133b-kube-api-access-5zsr4\") pod \"redhat-marketplace-66hlg\" (UID: \"b659ecc7-c237-4169-8512-3f4e9ad3133b\") " pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.618337 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c24vp"]
Oct 06 14:57:42 crc kubenswrapper[4763]: W1006 14:57:42.621873 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6522382_5c77_4a83_9955_edec44f773dc.slice/crio-200b5789dcf026d906490ec1f3b663e203d465385c372c23ff4c30be271c8fd5 WatchSource:0}: Error finding container 200b5789dcf026d906490ec1f3b663e203d465385c372c23ff4c30be271c8fd5: Status 404 returned error can't find the container with id 200b5789dcf026d906490ec1f3b663e203d465385c372c23ff4c30be271c8fd5
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.656876 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.803145 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6522382-5c77-4a83-9955-edec44f773dc" containerID="d3a3f9419f9b336559ca89c2ee101b7e0233f43cbc8be4750d09803633e1f168" exitCode=0
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.803231 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24vp" event={"ID":"f6522382-5c77-4a83-9955-edec44f773dc","Type":"ContainerDied","Data":"d3a3f9419f9b336559ca89c2ee101b7e0233f43cbc8be4750d09803633e1f168"}
Oct 06 14:57:42 crc kubenswrapper[4763]: I1006 14:57:42.803272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24vp" event={"ID":"f6522382-5c77-4a83-9955-edec44f773dc","Type":"ContainerStarted","Data":"200b5789dcf026d906490ec1f3b663e203d465385c372c23ff4c30be271c8fd5"}
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.058085 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66hlg"]
Oct 06 14:57:43 crc kubenswrapper[4763]: W1006 14:57:43.062531 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb659ecc7_c237_4169_8512_3f4e9ad3133b.slice/crio-61a6971424e766ddce965664060bd3561ffe58786fb5eeab8295deb1b2f97b70 WatchSource:0}: Error finding container 61a6971424e766ddce965664060bd3561ffe58786fb5eeab8295deb1b2f97b70: Status 404 returned error can't find the container with id 61a6971424e766ddce965664060bd3561ffe58786fb5eeab8295deb1b2f97b70
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.809799 4763 generic.go:334] "Generic (PLEG): container finished" podID="b659ecc7-c237-4169-8512-3f4e9ad3133b" containerID="fec93f12c84f4f2d9b813bd1a3dc04c7b98bf114a771620811b25c4e7dfa4922" exitCode=0
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.809858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66hlg" event={"ID":"b659ecc7-c237-4169-8512-3f4e9ad3133b","Type":"ContainerDied","Data":"fec93f12c84f4f2d9b813bd1a3dc04c7b98bf114a771620811b25c4e7dfa4922"}
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.810168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66hlg" event={"ID":"b659ecc7-c237-4169-8512-3f4e9ad3133b","Type":"ContainerStarted","Data":"61a6971424e766ddce965664060bd3561ffe58786fb5eeab8295deb1b2f97b70"}
Oct 06
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.813122 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6522382-5c77-4a83-9955-edec44f773dc" containerID="11ae682b45c6a79191b216f6dd1631913e7813fb3459c8adeef170d2f711650e" exitCode=0
Oct 06 14:57:43 crc kubenswrapper[4763]: I1006 14:57:43.813506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24vp" event={"ID":"f6522382-5c77-4a83-9955-edec44f773dc","Type":"ContainerDied","Data":"11ae682b45c6a79191b216f6dd1631913e7813fb3459c8adeef170d2f711650e"}
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.543819 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7xjg2"]
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.545828 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.546853 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xjg2"]
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.548481 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.707068 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-utilities\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.707122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-catalog-content\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.707220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqq2\" (UniqueName: \"kubernetes.io/projected/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-kube-api-access-lvqq2\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.740067 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rj6qt"]
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.741306 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.744815 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.760914 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rj6qt"]
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.808344 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-utilities\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.808396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-catalog-content\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.808455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqq2\" (UniqueName: \"kubernetes.io/projected/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-kube-api-access-lvqq2\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.809292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-utilities\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.809321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-catalog-content\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.824102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24vp" event={"ID":"f6522382-5c77-4a83-9955-edec44f773dc","Type":"ContainerStarted","Data":"e84ca6f389d1653c817f68d34b04dcfad7318d0ed3ee1d6ff937f73907e25ff6"}
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.841712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvqq2\" (UniqueName: \"kubernetes.io/projected/d033ece4-a5ff-4492-92e9-2a2af22d0ce9-kube-api-access-lvqq2\") pod \"community-operators-7xjg2\" (UID: \"d033ece4-a5ff-4492-92e9-2a2af22d0ce9\") " pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.843341 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c24vp" podStartSLOduration=1.3206907669999999 podStartE2EDuration="2.843315267s" podCreationTimestamp="2025-10-06 14:57:42 +0000 UTC" firstStartedPulling="2025-10-06 14:57:42.80442594 +0000 UTC m=+259.959718452" lastFinishedPulling="2025-10-06 14:57:44.32705044 +0000 UTC m=+261.482342952" observedRunningTime="2025-10-06 14:57:44.84129381 +0000 UTC m=+261.996586362" watchObservedRunningTime="2025-10-06 14:57:44.843315267 +0000 UTC m=+261.998607819"
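The pod_startup_latency_tracker entry above is internally consistent: the numbers match podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp, and podStartSLOduration = that span minus the image-pull window (lastFinishedPulling - firstStartedPulling). A standalone Go sketch (not kubelet code) reproducing the arithmetic from the logged timestamps:

package main

import (
    "fmt"
    "time"
)

func main() {
    // Timestamps copied from the certified-operators-c24vp entry above.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    parse := func(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }
    created := parse("2025-10-06 14:57:42 +0000 UTC")
    firstPull := parse("2025-10-06 14:57:42.80442594 +0000 UTC")
    lastPull := parse("2025-10-06 14:57:44.32705044 +0000 UTC")
    watchObservedRunning := parse("2025-10-06 14:57:44.843315267 +0000 UTC")

    e2e := watchObservedRunning.Sub(created) // podStartE2EDuration: 2.843315267s
    slo := e2e - lastPull.Sub(firstPull)     // pull window excluded: 1.320690767s
    fmt.Println(e2e, slo)
}

The log's podStartSLOduration=1.3206907669999999 is the same 1.320690767s value carrying a float64 rounding artifact.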
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.870628 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.909653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-utilities\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.909719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd472\" (UniqueName: \"kubernetes.io/projected/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-kube-api-access-gd472\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:44 crc kubenswrapper[4763]: I1006 14:57:44.909761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-catalog-content\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.011013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-catalog-content\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.011396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-utilities\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.011585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-catalog-content\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.012224 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd472\" (UniqueName: \"kubernetes.io/projected/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-kube-api-access-gd472\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.012975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-utilities\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.030129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd472\" (UniqueName: \"kubernetes.io/projected/f0874cb8-27ec-46e8-b17d-f50e1a6c63ea-kube-api-access-gd472\") pod \"redhat-operators-rj6qt\" (UID: \"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea\") " pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.073833 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xjg2"]
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.075318 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:45 crc kubenswrapper[4763]: W1006 14:57:45.139158 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd033ece4_a5ff_4492_92e9_2a2af22d0ce9.slice/crio-56c57a526d627bda4ae6241054c9f9e174210eed0f9b5ab7dbc1b5646292d6a1 WatchSource:0}: Error finding container 56c57a526d627bda4ae6241054c9f9e174210eed0f9b5ab7dbc1b5646292d6a1: Status 404 returned error can't find the container with id 56c57a526d627bda4ae6241054c9f9e174210eed0f9b5ab7dbc1b5646292d6a1
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.276189 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rj6qt"]
Oct 06 14:57:45 crc kubenswrapper[4763]: W1006 14:57:45.293167 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0874cb8_27ec_46e8_b17d_f50e1a6c63ea.slice/crio-6b2a349ca35abc984d234f7ccc4eac0eef9616717b78adf74beb3d1b5edf59b1 WatchSource:0}: Error finding container 6b2a349ca35abc984d234f7ccc4eac0eef9616717b78adf74beb3d1b5edf59b1: Status 404 returned error can't find the container with id 6b2a349ca35abc984d234f7ccc4eac0eef9616717b78adf74beb3d1b5edf59b1
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.832224 4763 generic.go:334] "Generic (PLEG): container finished" podID="f0874cb8-27ec-46e8-b17d-f50e1a6c63ea" containerID="1750d092f883bf1948a60abfa1c50ff29e79cd0e675d5a86132798a15e8bea34" exitCode=0
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.832272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rj6qt" event={"ID":"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea","Type":"ContainerDied","Data":"1750d092f883bf1948a60abfa1c50ff29e79cd0e675d5a86132798a15e8bea34"}
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.832565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rj6qt" event={"ID":"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea","Type":"ContainerStarted","Data":"6b2a349ca35abc984d234f7ccc4eac0eef9616717b78adf74beb3d1b5edf59b1"}
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.836522 4763 generic.go:334] "Generic (PLEG): container finished" podID="d033ece4-a5ff-4492-92e9-2a2af22d0ce9" containerID="50f185d0a787302f358b4ea90294d453b4281d0701738c030e84241a822f9388" exitCode=0
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.836629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xjg2" event={"ID":"d033ece4-a5ff-4492-92e9-2a2af22d0ce9","Type":"ContainerDied","Data":"50f185d0a787302f358b4ea90294d453b4281d0701738c030e84241a822f9388"}
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.836659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xjg2" event={"ID":"d033ece4-a5ff-4492-92e9-2a2af22d0ce9","Type":"ContainerStarted","Data":"56c57a526d627bda4ae6241054c9f9e174210eed0f9b5ab7dbc1b5646292d6a1"}
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.838896 4763 generic.go:334] "Generic (PLEG): container finished" podID="b659ecc7-c237-4169-8512-3f4e9ad3133b" containerID="e227639707fc261ab42fae3c5cb7cfbb1e63cc2f021809cfdf2ab42a3da74aae" exitCode=0
Oct 06 14:57:45 crc kubenswrapper[4763]: I1006 14:57:45.839334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66hlg" event={"ID":"b659ecc7-c237-4169-8512-3f4e9ad3133b","Type":"ContainerDied","Data":"e227639707fc261ab42fae3c5cb7cfbb1e63cc2f021809cfdf2ab42a3da74aae"}
Oct 06 14:57:47 crc kubenswrapper[4763]: I1006 14:57:47.858944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rj6qt" event={"ID":"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea","Type":"ContainerStarted","Data":"bb9dd6e30bce10f6afce519df9781434c51d261b1a4d8722d7cb8fda21ac1216"}
Oct 06 14:57:47 crc kubenswrapper[4763]: I1006 14:57:47.861137 4763 generic.go:334] "Generic (PLEG): container finished" podID="d033ece4-a5ff-4492-92e9-2a2af22d0ce9" containerID="5f55a5fe227eb63def11ca854e4ee0bfdfe255605e982dd72b0544e078bb16aa" exitCode=0
Oct 06 14:57:47 crc kubenswrapper[4763]: I1006 14:57:47.861299 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xjg2" event={"ID":"d033ece4-a5ff-4492-92e9-2a2af22d0ce9","Type":"ContainerDied","Data":"5f55a5fe227eb63def11ca854e4ee0bfdfe255605e982dd72b0544e078bb16aa"}
Oct 06 14:57:47 crc kubenswrapper[4763]: I1006 14:57:47.863659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66hlg" event={"ID":"b659ecc7-c237-4169-8512-3f4e9ad3133b","Type":"ContainerStarted","Data":"3326c7b969db0feebfaa234fe00ff9486aefee434f3b355d6cd983cfd4ba9536"}
Oct 06 14:57:47 crc kubenswrapper[4763]: I1006 14:57:47.917731 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66hlg" podStartSLOduration=3.499605667 podStartE2EDuration="5.917716017s" podCreationTimestamp="2025-10-06 14:57:42 +0000 UTC" firstStartedPulling="2025-10-06 14:57:43.812759883 +0000 UTC m=+260.968052395" lastFinishedPulling="2025-10-06 14:57:46.230870193 +0000 UTC m=+263.386162745" observedRunningTime="2025-10-06 14:57:47.914209648 +0000 UTC m=+265.069502200" watchObservedRunningTime="2025-10-06 14:57:47.917716017 +0000 UTC m=+265.073008529"
Oct 06 14:57:48 crc kubenswrapper[4763]: I1006 14:57:48.872298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xjg2" event={"ID":"d033ece4-a5ff-4492-92e9-2a2af22d0ce9","Type":"ContainerStarted","Data":"9bb9074895c1d05bb9bcdae292a5f30e51ecce33150f27832a01e75569fe5fb8"}
Oct 06 14:57:48 crc kubenswrapper[4763]: I1006 14:57:48.874638 4763 generic.go:334] "Generic (PLEG): container finished" podID="f0874cb8-27ec-46e8-b17d-f50e1a6c63ea" containerID="bb9dd6e30bce10f6afce519df9781434c51d261b1a4d8722d7cb8fda21ac1216" exitCode=0
Oct 06 14:57:48 crc kubenswrapper[4763]: I1006 14:57:48.874741 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rj6qt" event={"ID":"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea","Type":"ContainerDied","Data":"bb9dd6e30bce10f6afce519df9781434c51d261b1a4d8722d7cb8fda21ac1216"}
Oct 06 14:57:48 crc kubenswrapper[4763]: I1006 14:57:48.898782 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7xjg2" podStartSLOduration=2.480814171 podStartE2EDuration="4.898754198s" podCreationTimestamp="2025-10-06 14:57:44 +0000 UTC" firstStartedPulling="2025-10-06 14:57:45.837967645 +0000 UTC m=+262.993260177" lastFinishedPulling="2025-10-06 14:57:48.255907682 +0000 UTC m=+265.411200204" observedRunningTime="2025-10-06 14:57:48.892786549 +0000 UTC m=+266.048079091" watchObservedRunningTime="2025-10-06 14:57:48.898754198 +0000 UTC m=+266.054046740"
Oct 06 14:57:49 crc kubenswrapper[4763]: I1006 14:57:49.884858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rj6qt" event={"ID":"f0874cb8-27ec-46e8-b17d-f50e1a6c63ea","Type":"ContainerStarted","Data":"94503c621ed9db896bbe2a10d476affc5c47cc8661ac3e53cd82846658b278f4"}
Oct 06 14:57:49 crc kubenswrapper[4763]: I1006 14:57:49.902663 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rj6qt" podStartSLOduration=2.391229137 podStartE2EDuration="5.902644688s" podCreationTimestamp="2025-10-06 14:57:44 +0000 UTC" firstStartedPulling="2025-10-06 14:57:45.83499399 +0000 UTC m=+262.990286502" lastFinishedPulling="2025-10-06 14:57:49.346409541 +0000 UTC m=+266.501702053" observedRunningTime="2025-10-06 14:57:49.901920007 +0000 UTC m=+267.057212549" watchObservedRunningTime="2025-10-06 14:57:49.902644688 +0000 UTC m=+267.057937230"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.458050 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.458307 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.501512 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.657711 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.657800 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.707806 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.951193 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66hlg"
Oct 06 14:57:52 crc kubenswrapper[4763]: I1006 14:57:52.954648 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c24vp"
Oct 06 14:57:54 crc kubenswrapper[4763]: I1006 14:57:54.871885 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:54 crc kubenswrapper[4763]: I1006 14:57:54.871972 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:54 crc kubenswrapper[4763]: I1006 14:57:54.922098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:57:55 crc kubenswrapper[4763]: I1006 14:57:55.077302 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:55 crc kubenswrapper[4763]: I1006 14:57:55.077363 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:55 crc kubenswrapper[4763]: I1006 14:57:55.137256 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:55 crc kubenswrapper[4763]: I1006 14:57:55.960721 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rj6qt"
Oct 06 14:57:55 crc kubenswrapper[4763]: I1006 14:57:55.962709 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7xjg2"
Oct 06 14:59:33 crc kubenswrapper[4763]: I1006 14:59:33.877285 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 14:59:33 crc kubenswrapper[4763]: I1006 14:59:33.878211 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.158412 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"]
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.160528 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.163474 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.166185 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.173931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"]
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.250290 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlmx4\" (UniqueName: \"kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.250397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.250450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.351777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlmx4\" (UniqueName: \"kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.351862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.351901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.353121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.361089 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.385483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlmx4\" (UniqueName: \"kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4\") pod \"collect-profiles-29329380-hpxqc\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.495699 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:00 crc kubenswrapper[4763]: I1006 15:00:00.769664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"]
Oct 06 15:00:01 crc kubenswrapper[4763]: I1006 15:00:01.736229 4763 generic.go:334] "Generic (PLEG): container finished" podID="bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" containerID="0ee34648dbf050b997ea9e29f52627bfdcf9dfb106a1a2ac300080c1dc6bf4f5" exitCode=0
Oct 06 15:00:01 crc kubenswrapper[4763]: I1006 15:00:01.736586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc" event={"ID":"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe","Type":"ContainerDied","Data":"0ee34648dbf050b997ea9e29f52627bfdcf9dfb106a1a2ac300080c1dc6bf4f5"}
Oct 06 15:00:01 crc kubenswrapper[4763]: I1006 15:00:01.736652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc" event={"ID":"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe","Type":"ContainerStarted","Data":"fa6a096e1f0dd4e222c9eb69a579f6ecc6a4dfd33f9b4c335c2a4087e01e41fe"}
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.043333 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.189208 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume\") pod \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") "
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.189293 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlmx4\" (UniqueName: \"kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4\") pod \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") "
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.189398 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume\") pod \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\" (UID: \"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe\") "
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.190689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" (UID: "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.195684 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4" (OuterVolumeSpecName: "kube-api-access-dlmx4") pod "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" (UID: "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe"). InnerVolumeSpecName "kube-api-access-dlmx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.196431 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" (UID: "bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.291905 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.291983 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlmx4\" (UniqueName: \"kubernetes.io/projected/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-kube-api-access-dlmx4\") on node \"crc\" DevicePath \"\""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.292008 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.754239 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc" event={"ID":"bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe","Type":"ContainerDied","Data":"fa6a096e1f0dd4e222c9eb69a579f6ecc6a4dfd33f9b4c335c2a4087e01e41fe"}
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.754298 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6a096e1f0dd4e222c9eb69a579f6ecc6a4dfd33f9b4c335c2a4087e01e41fe"
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.754334 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.877290 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:00:03 crc kubenswrapper[4763]: I1006 15:00:03.877393 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.162847 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h6dsg"]
Oct 06 15:00:07 crc kubenswrapper[4763]: E1006 15:00:07.163967 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" containerName="collect-profiles"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.164084 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" containerName="collect-profiles"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.164281 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" containerName="collect-profiles"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.164813 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.185854 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h6dsg"]
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.345365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-bound-sa-token\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.345972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-tls\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.346212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.346451 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbab1501-c266-4dcf-8499-581d30bbf0da-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.346676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-trusted-ca\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.346907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plq79\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-kube-api-access-plq79\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.347132 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-certificates\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.347366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbab1501-c266-4dcf-8499-581d30bbf0da-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.381095 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.449264 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbab1501-c266-4dcf-8499-581d30bbf0da-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.449808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-bound-sa-token\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.450111 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-tls\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.450383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-trusted-ca\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.450609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbab1501-c266-4dcf-8499-581d30bbf0da-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.451416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbab1501-c266-4dcf-8499-581d30bbf0da-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.451770 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plq79\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-kube-api-access-plq79\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.451978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-certificates\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.452957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-trusted-ca\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.453724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-tls\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.455436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbab1501-c266-4dcf-8499-581d30bbf0da-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.454368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbab1501-c266-4dcf-8499-581d30bbf0da-registry-certificates\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.472670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plq79\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-kube-api-access-plq79\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.474366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbab1501-c266-4dcf-8499-581d30bbf0da-bound-sa-token\") pod \"image-registry-66df7c8f76-h6dsg\" (UID: \"cbab1501-c266-4dcf-8499-581d30bbf0da\") " pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.485776 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.706919 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h6dsg"]
Oct 06 15:00:07 crc kubenswrapper[4763]: I1006 15:00:07.782485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg" event={"ID":"cbab1501-c266-4dcf-8499-581d30bbf0da","Type":"ContainerStarted","Data":"c5cbc89ddaa804560f4a05582e09f86380a9e916ceab9cf5c00eff6df4a192b5"}
Oct 06 15:00:08 crc kubenswrapper[4763]: I1006 15:00:08.792427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg" event={"ID":"cbab1501-c266-4dcf-8499-581d30bbf0da","Type":"ContainerStarted","Data":"cb3548f1d3adc776f35cbfeba7001045df6be16b1226aee1d0ef93bfa4918a99"}
Oct 06 15:00:08 crc kubenswrapper[4763]: I1006 15:00:08.792689 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:08 crc kubenswrapper[4763]: I1006 15:00:08.821730 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg" podStartSLOduration=1.8217035830000001 podStartE2EDuration="1.821703583s" podCreationTimestamp="2025-10-06 15:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:00:08.82128732 +0000 UTC m=+405.976579872" watchObservedRunningTime="2025-10-06 15:00:08.821703583 +0000 UTC m=+405.976996165"
Oct 06 15:00:27 crc kubenswrapper[4763]: I1006 15:00:27.491789 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-h6dsg"
Oct 06 15:00:27 crc kubenswrapper[4763]: I1006 15:00:27.550484 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"]
Oct 06 15:00:33 crc kubenswrapper[4763]: I1006 15:00:33.877273 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:00:33 crc kubenswrapper[4763]: I1006 15:00:33.879271 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:00:33 crc kubenswrapper[4763]: I1006 15:00:33.879456 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw"
Oct 06 15:00:33 crc kubenswrapper[4763]: I1006 15:00:33.880418 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 15:00:33 crc kubenswrapper[4763]: I1006 15:00:33.880687 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5" gracePeriod=600
Oct 06 15:00:34 crc kubenswrapper[4763]: I1006 15:00:34.963660 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5" exitCode=0
Oct 06 15:00:34 crc kubenswrapper[4763]: I1006 15:00:34.963713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5"}
Oct 06 15:00:34 crc kubenswrapper[4763]: I1006 15:00:34.964054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa"}
Oct 06 15:00:34 crc kubenswrapper[4763]: I1006 15:00:34.964084 4763 scope.go:117] "RemoveContainer" containerID="b425969bf880a8ca26eac409af3e2d73aa69fa98debc8c65b76f6dcc83374a5f"
Oct 06 15:00:52 crc kubenswrapper[4763]: I1006 15:00:52.600728 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" podUID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" containerName="registry" containerID="cri-o://9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef" gracePeriod=30
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.022935 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.084086 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" containerID="9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef" exitCode=0
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.084127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" event={"ID":"4f380f6e-5ecf-460d-b7ec-9e7c36c21326","Type":"ContainerDied","Data":"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"}
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.084155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c" event={"ID":"4f380f6e-5ecf-460d-b7ec-9e7c36c21326","Type":"ContainerDied","Data":"0a1a3ed42c8058fe1a4a9dc68e66ddae950334884f7de07979d3253f19e30976"}
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.084175 4763 scope.go:117] "RemoveContainer" containerID="9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"
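"Killing container with a grace period" above is the usual termination sequence: the runtime signals the container to stop, waits up to gracePeriod seconds (600 for machine-config-daemon, 30 for the registry), then force-kills it. A self-contained Go sketch of that escalation for a local process, as an analogy only (the real work happens in CRI-O against the container's init process, not in code like this):

package main

import (
    "fmt"
    "os/exec"
    "syscall"
    "time"
)

// killWithGrace sends SIGTERM, waits up to grace for the process to exit,
// then escalates to SIGKILL -- the same shape as the grace-period kill
// sequence logged above.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
    _ = cmd.Process.Signal(syscall.SIGTERM)
    done := make(chan error, 1)
    go func() { done <- cmd.Wait() }()
    select {
    case <-done:
        fmt.Println("exited within grace period")
    case <-time.After(grace):
        _ = cmd.Process.Kill() // SIGKILL
        <-done
        fmt.Println("killed after grace period expired")
    }
}

func main() {
    cmd := exec.Command("sleep", "60")
    if err := cmd.Start(); err != nil {
        panic(err)
    }
    killWithGrace(cmd, 2*time.Second)
}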
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.084827 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9wh2c"
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.107601 4763 scope.go:117] "RemoveContainer" containerID="9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"
Oct 06 15:00:53 crc kubenswrapper[4763]: E1006 15:00:53.108407 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef\": container with ID starting with 9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef not found: ID does not exist" containerID="9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"
Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.108467 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef"} err="failed to get container status \"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef\": rpc error: code = NotFound desc = could not find container \"9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef\": container with ID starting with 9bbba5a07a37e46edb33199457c06fff102210c8201ff700bacaddf8127a2cef not found: ID does not exist"
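The NotFound error above is benign: the container was already gone by the time the follow-up RemoveContainer ran, so the status lookup simply reports that the ID no longer exists. Cleanup paths are commonly written to treat not-found as success, since the goal state (container absent) already holds. A tiny standalone Go sketch of that idempotent-delete convention, illustrative only, not the CRI client:

package main

import (
    "errors"
    "fmt"
)

var errNotFound = errors.New("not found")

// deleteContainer treats "already gone" as success, mirroring how the
// cleanup above tolerates the NotFound status during container removal.
func deleteContainer(remove func(id string) error, id string) error {
    if err := remove(id); err != nil && !errors.Is(err, errNotFound) {
        return err
    }
    return nil // deleted, or was never there -- either way the goal state holds
}

func main() {
    gone := func(id string) error { return fmt.Errorf("container %s: %w", id, errNotFound) }
    fmt.Println(deleteContainer(gone, "9bbba5a07a37"))
}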
I1006 15:00:53.152510 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted\") pod \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.152551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca\") pod \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\" (UID: \"4f380f6e-5ecf-460d-b7ec-9e7c36c21326\") " Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.154250 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.154293 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.161049 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.161763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-kube-api-access-bncjz" (OuterVolumeSpecName: "kube-api-access-bncjz") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "kube-api-access-bncjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.162099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.162949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.167708 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.181170 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4f380f6e-5ecf-460d-b7ec-9e7c36c21326" (UID: "4f380f6e-5ecf-460d-b7ec-9e7c36c21326"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254028 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bncjz\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-kube-api-access-bncjz\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254086 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254107 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254126 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254147 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254165 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.254185 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f380f6e-5ecf-460d-b7ec-9e7c36c21326-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.435376 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"] Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.438590 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9wh2c"] Oct 06 15:00:53 crc kubenswrapper[4763]: I1006 15:00:53.586392 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" path="/var/lib/kubelet/pods/4f380f6e-5ecf-460d-b7ec-9e7c36c21326/volumes" Oct 06 15:03:03 crc 
kubenswrapper[4763]: I1006 15:03:03.876376 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:03:03 crc kubenswrapper[4763]: I1006 15:03:03.876896 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:03:33 crc kubenswrapper[4763]: I1006 15:03:33.876809 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:03:33 crc kubenswrapper[4763]: I1006 15:03:33.880181 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:04:03 crc kubenswrapper[4763]: I1006 15:04:03.876273 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:04:03 crc kubenswrapper[4763]: I1006 15:04:03.876940 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:04:03 crc kubenswrapper[4763]: I1006 15:04:03.877000 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:04:03 crc kubenswrapper[4763]: I1006 15:04:03.877771 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:04:03 crc kubenswrapper[4763]: I1006 15:04:03.877849 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa" gracePeriod=600 Oct 06 15:04:04 crc kubenswrapper[4763]: I1006 15:04:04.328765 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa" exitCode=0 Oct 06 15:04:04 crc kubenswrapper[4763]: 
I1006 15:04:04.329187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa"} Oct 06 15:04:04 crc kubenswrapper[4763]: I1006 15:04:04.329219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c"} Oct 06 15:04:04 crc kubenswrapper[4763]: I1006 15:04:04.329239 4763 scope.go:117] "RemoveContainer" containerID="ec7df7ad7c0a76d74f3aca8ef5c84926ad1838d5dc2831784880a51f821dd0d5" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.885560 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-b7qld"] Oct 06 15:04:36 crc kubenswrapper[4763]: E1006 15:04:36.886379 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" containerName="registry" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.886393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" containerName="registry" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.886492 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f380f6e-5ecf-460d-b7ec-9e7c36c21326" containerName="registry" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.886894 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.889432 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.890256 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.891777 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.901415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6jdbs" Oct 06 15:04:36 crc kubenswrapper[4763]: I1006 15:04:36.912447 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"] Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.018193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.018336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.018419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzt9\" (UniqueName: 
\"kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.120849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.120953 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.121133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.121502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.122192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.143231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9\") pod \"crc-storage-crc-b7qld\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.245244 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.513651 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"] Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.527672 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:04:37 crc kubenswrapper[4763]: I1006 15:04:37.562937 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b7qld" event={"ID":"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33","Type":"ContainerStarted","Data":"633f63a2dc4ccee3a756163bc5fc3bcac76f617486c8480d0253e3957a798df7"} Oct 06 15:04:39 crc kubenswrapper[4763]: I1006 15:04:39.578461 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" containerID="4608cf4e7028faeeb3a1485936e706f8467c82a40c7c5826204521276501b127" exitCode=0 Oct 06 15:04:39 crc kubenswrapper[4763]: I1006 15:04:39.586988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b7qld" event={"ID":"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33","Type":"ContainerDied","Data":"4608cf4e7028faeeb3a1485936e706f8467c82a40c7c5826204521276501b127"} Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.912605 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.976382 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt\") pod \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.976497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage\") pod \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.976536 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9\") pod \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\" (UID: \"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33\") " Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.976530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" (UID: "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.976924 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:40 crc kubenswrapper[4763]: I1006 15:04:40.984757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9" (OuterVolumeSpecName: "kube-api-access-rpzt9") pod "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" (UID: "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33"). 
InnerVolumeSpecName "kube-api-access-rpzt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.001304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" (UID: "8ccb8f11-9845-413f-b1a5-3c3b73e8fc33"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.078864 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.078914 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33-kube-api-access-rpzt9\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.593545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b7qld" event={"ID":"8ccb8f11-9845-413f-b1a5-3c3b73e8fc33","Type":"ContainerDied","Data":"633f63a2dc4ccee3a756163bc5fc3bcac76f617486c8480d0253e3957a798df7"} Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.593606 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="633f63a2dc4ccee3a756163bc5fc3bcac76f617486c8480d0253e3957a798df7" Oct 06 15:04:41 crc kubenswrapper[4763]: I1006 15:04:41.593706 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b7qld" Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.841636 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jnftg"] Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842164 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-controller" containerID="cri-o://a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842456 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="sbdb" containerID="cri-o://cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842506 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="nbdb" containerID="cri-o://b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842587 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-node" containerID="cri-o://8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842606 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-acl-logging" containerID="cri-o://c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842650 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="northd" containerID="cri-o://225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.842648 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" gracePeriod=30 Oct 06 15:04:45 crc kubenswrapper[4763]: I1006 15:04:45.880105 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" containerID="cri-o://d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" gracePeriod=30 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.198706 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/3.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.201636 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovn-acl-logging/0.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.202104 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovn-controller/0.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.202526 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.274604 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fj6dx"] Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275030 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-node" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275071 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-node" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" containerName="storage" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275119 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" containerName="storage" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275140 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="sbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275158 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="sbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275178 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275194 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275300 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275320 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275341 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275358 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275379 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275394 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275414 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275432 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275454 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="northd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275469 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="northd" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275489 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-acl-logging" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275504 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-acl-logging" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275521 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="nbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275537 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="nbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275560 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275575 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275601 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kubecfg-setup" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275659 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kubecfg-setup" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.275686 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.275702 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276019 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="northd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276072 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276096 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="nbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276114 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276157 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276174 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="sbdb" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276208 4763 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-node" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276230 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" containerName="storage" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276247 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovn-acl-logging" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276264 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276286 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.276930 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.278735 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerName="ovnkube-controller" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.288511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377556 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377588 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljzf8\" (UniqueName: \"kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8\") pod 
\"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377703 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377768 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377814 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377846 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash" (OuterVolumeSpecName: "host-slash") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377865 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377851 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377886 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378021 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378041 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378138 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet\") pod \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\" (UID: \"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7\") " Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-netns\") pod 
\"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378335 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-systemd-units\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-var-lib-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-etc-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-bin\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-env-overrides\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f23aeaa1-aed6-4444-89cb-043c9a643130-ovn-node-metrics-cert\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378481 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-node-log\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-config\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378584 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-ovn\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-log-socket\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzzj\" (UniqueName: \"kubernetes.io/projected/f23aeaa1-aed6-4444-89cb-043c9a643130-kube-api-access-sxzzj\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378670 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-kubelet\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.377958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log" (OuterVolumeSpecName: "node-log") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378693 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-systemd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378664 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-netd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-slash\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378868 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-script-lib\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378852 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.378908 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket" (OuterVolumeSpecName: "log-socket") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379278 4763 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379308 4763 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379328 4763 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379427 4763 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-slash\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379450 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379470 4763 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379498 4763 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379525 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379552 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379577 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379601 4763 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-node-log\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379668 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.379692 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.384267 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.384924 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8" (OuterVolumeSpecName: "kube-api-access-ljzf8") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "kube-api-access-ljzf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.394304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" (UID: "fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.480970 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-ovn\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481082 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-log-socket\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481107 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxzzj\" (UniqueName: \"kubernetes.io/projected/f23aeaa1-aed6-4444-89cb-043c9a643130-kube-api-access-sxzzj\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481151 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-kubelet\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-systemd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-ovn\") pod \"ovnkube-node-fj6dx\" (UID: 
\"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481135 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-log-socket\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-kubelet\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-netd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481337 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-systemd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481362 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-netd\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-slash\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-script-lib\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-slash\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481477 
4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-netns\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-systemd-units\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-var-lib-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-run-netns\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-etc-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481642 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-etc-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-bin\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-run-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-env-overrides\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-var-lib-openvswitch\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f23aeaa1-aed6-4444-89cb-043c9a643130-ovn-node-metrics-cert\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-systemd-units\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-cni-bin\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-node-log\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.481977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-node-log\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482010 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23aeaa1-aed6-4444-89cb-043c9a643130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-config\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482136 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljzf8\" (UniqueName: \"kubernetes.io/projected/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-kube-api-access-ljzf8\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482160 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482179 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482201 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482227 4763 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482252 4763 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482273 4763 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-env-overrides\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.482962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-config\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.483537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f23aeaa1-aed6-4444-89cb-043c9a643130-ovnkube-script-lib\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.485909 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f23aeaa1-aed6-4444-89cb-043c9a643130-ovn-node-metrics-cert\") pod \"ovnkube-node-fj6dx\" (UID: 
\"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.502690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzzj\" (UniqueName: \"kubernetes.io/projected/f23aeaa1-aed6-4444-89cb-043c9a643130-kube-api-access-sxzzj\") pod \"ovnkube-node-fj6dx\" (UID: \"f23aeaa1-aed6-4444-89cb-043c9a643130\") " pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.606418 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.625583 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/2.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.626129 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/1.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.626179 4763 generic.go:334] "Generic (PLEG): container finished" podID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" containerID="5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7" exitCode=2 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.626250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerDied","Data":"5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.626289 4763 scope.go:117] "RemoveContainer" containerID="ea5087ca59184c74efb3cb8c7fae183a2593802dbf3c636277053ba8a2d03936" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.627162 4763 scope.go:117] "RemoveContainer" containerID="5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.627554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bj6z5_openshift-multus(22f7ff70-c0ad-406d-aa9d-6824cb935c66)\"" pod="openshift-multus/multus-bj6z5" podUID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.631382 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovnkube-controller/3.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.634873 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovn-acl-logging/0.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.635806 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jnftg_fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/ovn-controller/0.log" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636515 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636648 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" 
containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636667 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636816 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636850 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636860 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" exitCode=0 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636870 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" exitCode=143 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636884 4763 generic.go:334] "Generic (PLEG): container finished" podID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" exitCode=143 Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636893 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636956 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.636964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637453 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637536 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637637 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637734 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637831 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.637920 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638009 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638077 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638140 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638229 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638397 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638463 4763 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638526 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638586 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638685 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638790 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638857 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638920 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.638982 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639050 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639195 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639266 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639330 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639392 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639453 4763 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639595 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639697 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639812 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.639917 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640018 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jnftg" event={"ID":"fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7","Type":"ContainerDied","Data":"bf5ae8835061bdd818e8ef4eef71805be6b7f26f6d1e80aa3541a244e616ce0c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640252 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640360 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640491 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640597 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.640772 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.641000 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.641099 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.641194 4763 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.641287 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.641380 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} Oct 06 15:04:46 crc kubenswrapper[4763]: W1006 15:04:46.640376 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23aeaa1_aed6_4444_89cb_043c9a643130.slice/crio-51f8a6ad2ff44d45822dd329439c60ce3eea36669587eeb4b9394381866fc21c WatchSource:0}: Error finding container 51f8a6ad2ff44d45822dd329439c60ce3eea36669587eeb4b9394381866fc21c: Status 404 returned error can't find the container with id 51f8a6ad2ff44d45822dd329439c60ce3eea36669587eeb4b9394381866fc21c Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.677511 4763 scope.go:117] "RemoveContainer" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.690403 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jnftg"] Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.694856 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jnftg"] Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.711706 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.733931 4763 scope.go:117] "RemoveContainer" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.762533 4763 scope.go:117] "RemoveContainer" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.777809 4763 scope.go:117] "RemoveContainer" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.798769 4763 scope.go:117] "RemoveContainer" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.870549 4763 scope.go:117] "RemoveContainer" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.884180 4763 scope.go:117] "RemoveContainer" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.894974 4763 scope.go:117] "RemoveContainer" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.906426 4763 scope.go:117] "RemoveContainer" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.918841 4763 scope.go:117] "RemoveContainer" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.919489 4763 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": container with ID starting with d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41 not found: ID does not exist" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.919523 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} err="failed to get container status \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": rpc error: code = NotFound desc = could not find container \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": container with ID starting with d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.919544 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.919955 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": container with ID starting with a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824 not found: ID does not exist" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.919977 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} err="failed to get container status \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": rpc error: code = NotFound desc = could not find container \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": container with ID starting with a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.919991 4763 scope.go:117] "RemoveContainer" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.920311 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": container with ID starting with cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c not found: ID does not exist" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.920329 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} err="failed to get container status \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": rpc error: code = NotFound desc = could not find container \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": container with ID starting with cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.920342 4763 scope.go:117] 
"RemoveContainer" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.920771 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": container with ID starting with b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc not found: ID does not exist" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.920813 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} err="failed to get container status \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": rpc error: code = NotFound desc = could not find container \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": container with ID starting with b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.920860 4763 scope.go:117] "RemoveContainer" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.921200 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": container with ID starting with 225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf not found: ID does not exist" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921222 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} err="failed to get container status \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": rpc error: code = NotFound desc = could not find container \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": container with ID starting with 225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921256 4763 scope.go:117] "RemoveContainer" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.921572 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": container with ID starting with b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e not found: ID does not exist" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921591 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} err="failed to get container status \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": rpc error: code = NotFound desc = could not find container \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": container with ID starting with 
b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921603 4763 scope.go:117] "RemoveContainer" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.921854 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": container with ID starting with 8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c not found: ID does not exist" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921873 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} err="failed to get container status \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": rpc error: code = NotFound desc = could not find container \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": container with ID starting with 8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.921885 4763 scope.go:117] "RemoveContainer" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.922254 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": container with ID starting with c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac not found: ID does not exist" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.922318 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} err="failed to get container status \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": rpc error: code = NotFound desc = could not find container \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": container with ID starting with c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.922362 4763 scope.go:117] "RemoveContainer" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.922746 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": container with ID starting with a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd not found: ID does not exist" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.922843 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} err="failed to get container status \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": rpc 
error: code = NotFound desc = could not find container \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": container with ID starting with a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.922871 4763 scope.go:117] "RemoveContainer" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: E1006 15:04:46.923311 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": container with ID starting with 2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729 not found: ID does not exist" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.923373 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} err="failed to get container status \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": rpc error: code = NotFound desc = could not find container \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": container with ID starting with 2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.923396 4763 scope.go:117] "RemoveContainer" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.923822 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} err="failed to get container status \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": rpc error: code = NotFound desc = could not find container \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": container with ID starting with d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.923845 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.924208 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} err="failed to get container status \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": rpc error: code = NotFound desc = could not find container \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": container with ID starting with a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.924232 4763 scope.go:117] "RemoveContainer" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.924647 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} err="failed to get container status \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": rpc 
error: code = NotFound desc = could not find container \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": container with ID starting with cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.924686 4763 scope.go:117] "RemoveContainer" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.925221 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} err="failed to get container status \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": rpc error: code = NotFound desc = could not find container \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": container with ID starting with b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.925273 4763 scope.go:117] "RemoveContainer" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.925729 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} err="failed to get container status \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": rpc error: code = NotFound desc = could not find container \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": container with ID starting with 225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.925749 4763 scope.go:117] "RemoveContainer" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.926088 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} err="failed to get container status \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": rpc error: code = NotFound desc = could not find container \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": container with ID starting with b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.926118 4763 scope.go:117] "RemoveContainer" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.926475 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} err="failed to get container status \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": rpc error: code = NotFound desc = could not find container \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": container with ID starting with 8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.926511 4763 scope.go:117] "RemoveContainer" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc 
kubenswrapper[4763]: I1006 15:04:46.926904 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} err="failed to get container status \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": rpc error: code = NotFound desc = could not find container \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": container with ID starting with c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.926927 4763 scope.go:117] "RemoveContainer" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927243 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} err="failed to get container status \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": rpc error: code = NotFound desc = could not find container \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": container with ID starting with a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927266 4763 scope.go:117] "RemoveContainer" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927566 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} err="failed to get container status \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": rpc error: code = NotFound desc = could not find container \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": container with ID starting with 2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927585 4763 scope.go:117] "RemoveContainer" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927917 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} err="failed to get container status \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": rpc error: code = NotFound desc = could not find container \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": container with ID starting with d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.927942 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928192 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} err="failed to get container status \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": rpc error: code = NotFound desc = could not find container \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": container with ID 
starting with a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928208 4763 scope.go:117] "RemoveContainer" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928416 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} err="failed to get container status \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": rpc error: code = NotFound desc = could not find container \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": container with ID starting with cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928445 4763 scope.go:117] "RemoveContainer" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928809 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} err="failed to get container status \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": rpc error: code = NotFound desc = could not find container \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": container with ID starting with b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.928829 4763 scope.go:117] "RemoveContainer" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929240 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} err="failed to get container status \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": rpc error: code = NotFound desc = could not find container \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": container with ID starting with 225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929266 4763 scope.go:117] "RemoveContainer" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929546 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} err="failed to get container status \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": rpc error: code = NotFound desc = could not find container \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": container with ID starting with b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929567 4763 scope.go:117] "RemoveContainer" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929804 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} err="failed to get container status \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": rpc error: code = NotFound desc = could not find container \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": container with ID starting with 8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.929823 4763 scope.go:117] "RemoveContainer" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930152 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} err="failed to get container status \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": rpc error: code = NotFound desc = could not find container \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": container with ID starting with c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930174 4763 scope.go:117] "RemoveContainer" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930401 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} err="failed to get container status \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": rpc error: code = NotFound desc = could not find container \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": container with ID starting with a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930423 4763 scope.go:117] "RemoveContainer" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930702 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} err="failed to get container status \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": rpc error: code = NotFound desc = could not find container \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": container with ID starting with 2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.930727 4763 scope.go:117] "RemoveContainer" containerID="d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931048 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41"} err="failed to get container status \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": rpc error: code = NotFound desc = could not find container \"d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41\": container with ID starting with d64cd6c8f4d46e79d930646a76fcba67cc38202b7cf3bbf6be5db6b5e5ddca41 not found: ID does not exist" Oct 
06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931081 4763 scope.go:117] "RemoveContainer" containerID="a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931492 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824"} err="failed to get container status \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": rpc error: code = NotFound desc = could not find container \"a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824\": container with ID starting with a1dd1f081506b320485cee379cf23ff3d58f5546e29280125a31d0b1a3b05824 not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931512 4763 scope.go:117] "RemoveContainer" containerID="cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931827 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c"} err="failed to get container status \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": rpc error: code = NotFound desc = could not find container \"cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c\": container with ID starting with cf4f85b50c87e687c4b3d6af96c63647f614931caf444f1873d45f0a2913690c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.931845 4763 scope.go:117] "RemoveContainer" containerID="b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932080 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc"} err="failed to get container status \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": rpc error: code = NotFound desc = could not find container \"b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc\": container with ID starting with b6efd63bba745bc67e4e96513763c8676957f623460bc258dacebd498f85e7bc not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932098 4763 scope.go:117] "RemoveContainer" containerID="225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932373 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf"} err="failed to get container status \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": rpc error: code = NotFound desc = could not find container \"225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf\": container with ID starting with 225998ba0e7097339a956927039ac3187cf5a5117ec4f543e6a46e3b67d10bcf not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932388 4763 scope.go:117] "RemoveContainer" containerID="b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932689 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e"} err="failed to get container status 
\"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": rpc error: code = NotFound desc = could not find container \"b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e\": container with ID starting with b098be7b527c34108c7b9ae5c25dbe079b00f8fa98615deb4f925ec4ab31507e not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.932705 4763 scope.go:117] "RemoveContainer" containerID="8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933048 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c"} err="failed to get container status \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": rpc error: code = NotFound desc = could not find container \"8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c\": container with ID starting with 8e95b53ec3637970fb3c54dd04f863429d18641108c13b8d7dfba765b1713e8c not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933070 4763 scope.go:117] "RemoveContainer" containerID="c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933438 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac"} err="failed to get container status \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": rpc error: code = NotFound desc = could not find container \"c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac\": container with ID starting with c85f4f5efb6016635728e3625c7e9fda4ae5b15562980b2c50c22270d1a05eac not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933459 4763 scope.go:117] "RemoveContainer" containerID="a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933829 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd"} err="failed to get container status \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": rpc error: code = NotFound desc = could not find container \"a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd\": container with ID starting with a9f095fde12f02603a48ebd3740089a2a2b8d0a187f311ef974eadcf2565abcd not found: ID does not exist" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.933848 4763 scope.go:117] "RemoveContainer" containerID="2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729" Oct 06 15:04:46 crc kubenswrapper[4763]: I1006 15:04:46.934223 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729"} err="failed to get container status \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": rpc error: code = NotFound desc = could not find container \"2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729\": container with ID starting with 2f4c87e4be03913667ab177385fcf0d61173abe2148db53268bb65f70c9c6729 not found: ID does not exist" Oct 06 15:04:47 crc kubenswrapper[4763]: I1006 15:04:47.583084 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7" path="/var/lib/kubelet/pods/fe8cb5e2-cf64-4ca2-a8b2-6044db6de7c7/volumes" Oct 06 15:04:47 crc kubenswrapper[4763]: I1006 15:04:47.648478 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/2.log" Oct 06 15:04:47 crc kubenswrapper[4763]: I1006 15:04:47.650072 4763 generic.go:334] "Generic (PLEG): container finished" podID="f23aeaa1-aed6-4444-89cb-043c9a643130" containerID="b0da0ca6a48607c32aa48fa462d0b62e90c51eebec6d0d3d77f85ac1a5779802" exitCode=0 Oct 06 15:04:47 crc kubenswrapper[4763]: I1006 15:04:47.650130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerDied","Data":"b0da0ca6a48607c32aa48fa462d0b62e90c51eebec6d0d3d77f85ac1a5779802"} Oct 06 15:04:47 crc kubenswrapper[4763]: I1006 15:04:47.650165 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"51f8a6ad2ff44d45822dd329439c60ce3eea36669587eeb4b9394381866fc21c"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.657860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"59b60a43f1c72b24e111556c8cbe18010077b65aa729fb7e96e639f56fbad4ac"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.658381 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"63c0c781856f508033b6508a8e809e136770c2e6ae7b608555ec07b8ef8a0b8e"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.658394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"f3fb687d126f212578eb69574a31904a30641df100475a4ea0a0f0ca359501b7"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.658403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"6ad847062a18125f98976801be1dd52325a836c794c2ad393a901e4fb7fd004e"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.658410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"d3131e15649e01a1dd6ae635aff7651b1229d93d87602c6d0feda80497f96bac"} Oct 06 15:04:48 crc kubenswrapper[4763]: I1006 15:04:48.658419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"619eede98459f1f32017a19263563549717c994d041d67c33fb19d75ce8edc45"} Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.405197 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl"] Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.406805 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.409083 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.433048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhqp\" (UniqueName: \"kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.433206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.433297 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.534416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.534531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.534646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhqp\" (UniqueName: \"kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.535195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.535332 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.569094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhqp\" (UniqueName: \"kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: I1006 15:04:49.732779 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: E1006 15:04:49.761111 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(8a39e689039a2fcfec5c80f185a8d4d9e6b93f1e5200b600bf174e1cedac4810): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 15:04:49 crc kubenswrapper[4763]: E1006 15:04:49.761239 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(8a39e689039a2fcfec5c80f185a8d4d9e6b93f1e5200b600bf174e1cedac4810): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: E1006 15:04:49.761314 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(8a39e689039a2fcfec5c80f185a8d4d9e6b93f1e5200b600bf174e1cedac4810): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:49 crc kubenswrapper[4763]: E1006 15:04:49.761427 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(8a39e689039a2fcfec5c80f185a8d4d9e6b93f1e5200b600bf174e1cedac4810): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" Oct 06 15:04:50 crc kubenswrapper[4763]: I1006 15:04:50.682738 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"3fe003ddba09f1afb8af5d80b7f462a72890ff0992f0215003814d964dfcd631"} Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.709836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" event={"ID":"f23aeaa1-aed6-4444-89cb-043c9a643130","Type":"ContainerStarted","Data":"1fcc399de2b954b9c8d395e24b3788fc0be5637752d647f32c691e4e7ac740c2"} Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.711801 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.711824 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.711837 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.777839 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl"] Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.778029 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.778603 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.778950 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.780806 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" podStartSLOduration=7.780784508 podStartE2EDuration="7.780784508s" podCreationTimestamp="2025-10-06 15:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:04:53.761818733 +0000 UTC m=+690.917111355" watchObservedRunningTime="2025-10-06 15:04:53.780784508 +0000 UTC m=+690.936077030" Oct 06 15:04:53 crc kubenswrapper[4763]: I1006 15:04:53.790960 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:04:53 crc kubenswrapper[4763]: E1006 15:04:53.806415 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(b055bad341006b52f4020b763232fe27449a34bc2d84991b0b6d7c062db129e2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 15:04:53 crc kubenswrapper[4763]: E1006 15:04:53.806926 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(b055bad341006b52f4020b763232fe27449a34bc2d84991b0b6d7c062db129e2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:53 crc kubenswrapper[4763]: E1006 15:04:53.807074 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(b055bad341006b52f4020b763232fe27449a34bc2d84991b0b6d7c062db129e2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:04:53 crc kubenswrapper[4763]: E1006 15:04:53.807225 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(b055bad341006b52f4020b763232fe27449a34bc2d84991b0b6d7c062db129e2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" Oct 06 15:04:58 crc kubenswrapper[4763]: I1006 15:04:58.575096 4763 scope.go:117] "RemoveContainer" containerID="5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7" Oct 06 15:04:58 crc kubenswrapper[4763]: E1006 15:04:58.575441 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bj6z5_openshift-multus(22f7ff70-c0ad-406d-aa9d-6824cb935c66)\"" pod="openshift-multus/multus-bj6z5" podUID="22f7ff70-c0ad-406d-aa9d-6824cb935c66" Oct 06 15:05:05 crc kubenswrapper[4763]: I1006 15:05:05.575014 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:05 crc kubenswrapper[4763]: I1006 15:05:05.576599 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:05 crc kubenswrapper[4763]: E1006 15:05:05.614328 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(750b02941a2226e3e6d2961957f8bdfd73afedd8cb1a454a8167c4eac7a2d497): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 15:05:05 crc kubenswrapper[4763]: E1006 15:05:05.614423 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(750b02941a2226e3e6d2961957f8bdfd73afedd8cb1a454a8167c4eac7a2d497): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:05 crc kubenswrapper[4763]: E1006 15:05:05.614459 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(750b02941a2226e3e6d2961957f8bdfd73afedd8cb1a454a8167c4eac7a2d497): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:05 crc kubenswrapper[4763]: E1006 15:05:05.614659 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace(04a1458d-6c4f-4a7a-8394-b10333d04d20)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_openshift-marketplace_04a1458d-6c4f-4a7a-8394-b10333d04d20_0(750b02941a2226e3e6d2961957f8bdfd73afedd8cb1a454a8167c4eac7a2d497): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" Oct 06 15:05:13 crc kubenswrapper[4763]: I1006 15:05:13.580587 4763 scope.go:117] "RemoveContainer" containerID="5749f4dfb5b91f18b41e4f51cd16226f76c1271954a7e8f76f3eda60bcf6cdf7" Oct 06 15:05:13 crc kubenswrapper[4763]: I1006 15:05:13.879079 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/2.log" Oct 06 15:05:14 crc kubenswrapper[4763]: I1006 15:05:14.889709 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bj6z5_22f7ff70-c0ad-406d-aa9d-6824cb935c66/kube-multus/2.log" Oct 06 15:05:14 crc kubenswrapper[4763]: I1006 15:05:14.890157 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bj6z5" event={"ID":"22f7ff70-c0ad-406d-aa9d-6824cb935c66","Type":"ContainerStarted","Data":"a3cd8df37d7c25d23ee61ec770dd18da073719a66f2b7504815c0a08630cf088"} Oct 06 15:05:16 crc kubenswrapper[4763]: I1006 15:05:16.641530 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fj6dx" Oct 06 15:05:18 crc kubenswrapper[4763]: I1006 15:05:18.575109 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:18 crc kubenswrapper[4763]: I1006 15:05:18.575803 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:18 crc kubenswrapper[4763]: I1006 15:05:18.816316 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl"] Oct 06 15:05:18 crc kubenswrapper[4763]: W1006 15:05:18.829869 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a1458d_6c4f_4a7a_8394_b10333d04d20.slice/crio-e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff WatchSource:0}: Error finding container e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff: Status 404 returned error can't find the container with id e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff Oct 06 15:05:18 crc kubenswrapper[4763]: I1006 15:05:18.915096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerStarted","Data":"e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff"} Oct 06 15:05:19 crc kubenswrapper[4763]: I1006 15:05:19.924600 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerStarted","Data":"2fdead177b635d627916dedac07c3648523fb4d772cd551172a7d8bd7789e508"} Oct 06 15:05:20 crc kubenswrapper[4763]: I1006 15:05:20.934455 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerID="2fdead177b635d627916dedac07c3648523fb4d772cd551172a7d8bd7789e508" exitCode=0 Oct 06 15:05:20 crc kubenswrapper[4763]: I1006 15:05:20.934539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerDied","Data":"2fdead177b635d627916dedac07c3648523fb4d772cd551172a7d8bd7789e508"} Oct 06 15:05:23 crc kubenswrapper[4763]: I1006 15:05:23.967291 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerID="224f929ab87dbf5adff9ba40cc97abb0c102783a2c8287f652728fd90f8c39ba" exitCode=0 Oct 06 15:05:23 crc kubenswrapper[4763]: I1006 15:05:23.967333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerDied","Data":"224f929ab87dbf5adff9ba40cc97abb0c102783a2c8287f652728fd90f8c39ba"} Oct 06 15:05:24 crc kubenswrapper[4763]: I1006 15:05:24.979699 4763 generic.go:334] "Generic (PLEG): container finished" podID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerID="4c2e9def4cab830d7dbb30817d8a1af1860ea7d2231bf6d94bdbc9cc6af26d82" exitCode=0 Oct 06 15:05:24 crc kubenswrapper[4763]: I1006 15:05:24.979761 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerDied","Data":"4c2e9def4cab830d7dbb30817d8a1af1860ea7d2231bf6d94bdbc9cc6af26d82"} Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.268953 4763 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.373444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util\") pod \"04a1458d-6c4f-4a7a-8394-b10333d04d20\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.373548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrhqp\" (UniqueName: \"kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp\") pod \"04a1458d-6c4f-4a7a-8394-b10333d04d20\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.373675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle\") pod \"04a1458d-6c4f-4a7a-8394-b10333d04d20\" (UID: \"04a1458d-6c4f-4a7a-8394-b10333d04d20\") " Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.375119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle" (OuterVolumeSpecName: "bundle") pod "04a1458d-6c4f-4a7a-8394-b10333d04d20" (UID: "04a1458d-6c4f-4a7a-8394-b10333d04d20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.382105 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp" (OuterVolumeSpecName: "kube-api-access-nrhqp") pod "04a1458d-6c4f-4a7a-8394-b10333d04d20" (UID: "04a1458d-6c4f-4a7a-8394-b10333d04d20"). InnerVolumeSpecName "kube-api-access-nrhqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.394301 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util" (OuterVolumeSpecName: "util") pod "04a1458d-6c4f-4a7a-8394-b10333d04d20" (UID: "04a1458d-6c4f-4a7a-8394-b10333d04d20"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.476040 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-util\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.476118 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrhqp\" (UniqueName: \"kubernetes.io/projected/04a1458d-6c4f-4a7a-8394-b10333d04d20-kube-api-access-nrhqp\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.476674 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04a1458d-6c4f-4a7a-8394-b10333d04d20-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.999358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" event={"ID":"04a1458d-6c4f-4a7a-8394-b10333d04d20","Type":"ContainerDied","Data":"e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff"} Oct 06 15:05:26 crc kubenswrapper[4763]: I1006 15:05:26.999703 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e17d61b1198ab6fbc253283baf51e1dde9b498a372da5e90edd694f911f338ff" Oct 06 15:05:27 crc kubenswrapper[4763]: I1006 15:05:26.999752 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.200360 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5plp9"] Oct 06 15:05:31 crc kubenswrapper[4763]: E1006 15:05:31.200582 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="extract" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.200596 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="extract" Oct 06 15:05:31 crc kubenswrapper[4763]: E1006 15:05:31.200610 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="util" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.200663 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="util" Oct 06 15:05:31 crc kubenswrapper[4763]: E1006 15:05:31.200677 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="pull" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.200685 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="pull" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.200804 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a1458d-6c4f-4a7a-8394-b10333d04d20" containerName="extract" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.201232 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.203445 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.203597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7gnhd" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.203764 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.211776 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5plp9"] Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.341021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndmg\" (UniqueName: \"kubernetes.io/projected/1e655810-1b15-490a-ac92-54845114c6f6-kube-api-access-tndmg\") pod \"nmstate-operator-858ddd8f98-5plp9\" (UID: \"1e655810-1b15-490a-ac92-54845114c6f6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.442717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndmg\" (UniqueName: \"kubernetes.io/projected/1e655810-1b15-490a-ac92-54845114c6f6-kube-api-access-tndmg\") pod \"nmstate-operator-858ddd8f98-5plp9\" (UID: \"1e655810-1b15-490a-ac92-54845114c6f6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.465717 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndmg\" (UniqueName: \"kubernetes.io/projected/1e655810-1b15-490a-ac92-54845114c6f6-kube-api-access-tndmg\") pod \"nmstate-operator-858ddd8f98-5plp9\" (UID: \"1e655810-1b15-490a-ac92-54845114c6f6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.524729 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" Oct 06 15:05:31 crc kubenswrapper[4763]: I1006 15:05:31.779313 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5plp9"] Oct 06 15:05:32 crc kubenswrapper[4763]: I1006 15:05:32.030957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" event={"ID":"1e655810-1b15-490a-ac92-54845114c6f6","Type":"ContainerStarted","Data":"5f44def7f12aa3d41bb57eab4e579f22881110ef470038e318f88770f392b068"} Oct 06 15:05:35 crc kubenswrapper[4763]: I1006 15:05:35.054909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" event={"ID":"1e655810-1b15-490a-ac92-54845114c6f6","Type":"ContainerStarted","Data":"e14b795161b255794f4d206357a775963b7ff5215ef41bbecaf3084ea4c171e8"} Oct 06 15:05:35 crc kubenswrapper[4763]: I1006 15:05:35.089355 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5plp9" podStartSLOduration=1.985189772 podStartE2EDuration="4.089326622s" podCreationTimestamp="2025-10-06 15:05:31 +0000 UTC" firstStartedPulling="2025-10-06 15:05:31.789869428 +0000 UTC m=+728.945161940" lastFinishedPulling="2025-10-06 15:05:33.894006278 +0000 UTC m=+731.049298790" observedRunningTime="2025-10-06 15:05:35.081224664 +0000 UTC m=+732.236517206" watchObservedRunningTime="2025-10-06 15:05:35.089326622 +0000 UTC m=+732.244619164" Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.554313 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"] Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.555571 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn" Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.559197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dtlkd" Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.566532 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"] Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.567232 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.569205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.576038 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"] Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.583199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"] Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.597110 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s4ftg"] Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.597767 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.678012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zgn6\" (UniqueName: \"kubernetes.io/projected/a8dc2d16-42c4-438c-930b-c6de9707aa93-kube-api-access-6zgn6\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.678131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265gm\" (UniqueName: \"kubernetes.io/projected/5f21f303-50e0-4f36-aa64-53c5ae4f27c0-kube-api-access-265gm\") pod \"nmstate-metrics-fdff9cb8d-kvgpn\" (UID: \"5f21f303-50e0-4f36-aa64-53c5ae4f27c0\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.678167 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.697188 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"]
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.697964 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.699946 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.700139 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.703020 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9zr7f"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.704797 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"]
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-nmstate-lock\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265gm\" (UniqueName: \"kubernetes.io/projected/5f21f303-50e0-4f36-aa64-53c5ae4f27c0-kube-api-access-265gm\") pod \"nmstate-metrics-fdff9cb8d-kvgpn\" (UID: \"5f21f303-50e0-4f36-aa64-53c5ae4f27c0\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-ovs-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-dbus-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zgn6\" (UniqueName: \"kubernetes.io/projected/a8dc2d16-42c4-438c-930b-c6de9707aa93-kube-api-access-6zgn6\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.779484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnkd\" (UniqueName: \"kubernetes.io/projected/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-kube-api-access-dsnkd\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: E1006 15:05:40.779566 4763 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Oct 06 15:05:40 crc kubenswrapper[4763]: E1006 15:05:40.779667 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair podName:a8dc2d16-42c4-438c-930b-c6de9707aa93 nodeName:}" failed. No retries permitted until 2025-10-06 15:05:41.27964956 +0000 UTC m=+738.434942072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair") pod "nmstate-webhook-6cdbc54649-vvbkc" (UID: "a8dc2d16-42c4-438c-930b-c6de9707aa93") : secret "openshift-nmstate-webhook" not found
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.803668 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265gm\" (UniqueName: \"kubernetes.io/projected/5f21f303-50e0-4f36-aa64-53c5ae4f27c0-kube-api-access-265gm\") pod \"nmstate-metrics-fdff9cb8d-kvgpn\" (UID: \"5f21f303-50e0-4f36-aa64-53c5ae4f27c0\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.809259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zgn6\" (UniqueName: \"kubernetes.io/projected/a8dc2d16-42c4-438c-930b-c6de9707aa93-kube-api-access-6zgn6\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsqb\" (UniqueName: \"kubernetes.io/projected/9f18885b-ea4b-4790-9482-0d327c1873b3-kube-api-access-9nsqb\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-nmstate-lock\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-ovs-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-dbus-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880589 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f18885b-ea4b-4790-9482-0d327c1873b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnkd\" (UniqueName: \"kubernetes.io/projected/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-kube-api-access-dsnkd\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
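
The two E-level entries above show the kubelet's volume retry machinery: secret.go fails because the secret openshift-nmstate-webhook does not exist yet, and nestedpendingoperations.go parks the tls-key-pair mount with a backoff whose first step is the logged durationBeforeRetry of 500ms (the mount in fact succeeds on the retry at 15:05:41.294572 further down). A minimal Python sketch, using only timestamps copied from the entry, confirming that the retry gate is the failure time plus 500ms:

    from datetime import datetime, timedelta

    # Values copied from the nestedpendingoperations.go:348 entry above.
    failed_at = datetime.fromisoformat("2025-10-06 15:05:40.779667")  # log timestamp of the failure
    retry_at  = datetime.fromisoformat("2025-10-06 15:05:41.279650")  # "No retries permitted until ..." (rounded to microseconds)
    backoff   = timedelta(milliseconds=500)                           # durationBeforeRetry 500ms

    # The retry gate is failure time + backoff; the sub-millisecond skew is
    # just the operation being stamped slightly before the log line.
    assert abs((retry_at - failed_at) - backoff) < timedelta(milliseconds=1)
    print("retry allowed at:", retry_at.isoformat())
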
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880642 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-nmstate-lock\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f18885b-ea4b-4790-9482-0d327c1873b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880788 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-ovs-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.880943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-dbus-socket\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.889678 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.902203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnkd\" (UniqueName: \"kubernetes.io/projected/6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef-kube-api-access-dsnkd\") pod \"nmstate-handler-s4ftg\" (UID: \"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef\") " pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.922919 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.981984 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f18885b-ea4b-4790-9482-0d327c1873b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.982050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f18885b-ea4b-4790-9482-0d327c1873b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.982082 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsqb\" (UniqueName: \"kubernetes.io/projected/9f18885b-ea4b-4790-9482-0d327c1873b3-kube-api-access-9nsqb\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.983477 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f18885b-ea4b-4790-9482-0d327c1873b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.987594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f18885b-ea4b-4790-9482-0d327c1873b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.004779 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsqb\" (UniqueName: \"kubernetes.io/projected/9f18885b-ea4b-4790-9482-0d327c1873b3-kube-api-access-9nsqb\") pod \"nmstate-console-plugin-6b874cbd85-nbrs9\" (UID: \"9f18885b-ea4b-4790-9482-0d327c1873b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.013738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.058095 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75f565c5d8-mmxgw"]
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.058823 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75f565c5d8-mmxgw"
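
For readers new to these logs: each kubenswrapper payload is a klog header (a severity letter fused with the month and day, so I1006 is an Info line from Oct 06, then wall-clock time, a PID, and source file:line) followed by a structured message of quoted text and key=value pairs. A small parser sketch for lines in this capture; the regex is illustrative, not kubelet code:

    import re

    KLOG = re.compile(
        r'(?P<host>\S+) kubenswrapper\[(?P<pid>\d+)\]: '   # journald prefix
        r'(?P<sev>[IWE])(?P<mmdd>\d{4}) '                  # severity + month/day
        r'(?P<time>\d{2}:\d{2}:\d{2}\.\d+) \d+ '           # wall clock + thread id field
        r'(?P<src>[\w.]+:\d+)\] (?P<msg>.*)')              # file:line] message

    line = ('Oct 06 15:05:40 crc kubenswrapper[4763]: I1006 15:05:40.922919 4763 '
            'util.go:30] "No sandbox for pod can be found. Need to start a new one" '
            'pod="openshift-nmstate/nmstate-handler-s4ftg"')

    m = KLOG.search(line)
    assert m and m['sev'] == 'I' and m['src'] == 'util.go:30'
    print(m['time'], m['src'], m['msg'])
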
Need to start a new one" pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.086205 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f565c5d8-mmxgw"] Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.101064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4ftg" event={"ID":"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef","Type":"ContainerStarted","Data":"d375b3dd8c4abfbf6362e70f0f68d50b9d439a9e3248bc85bbd98bb9197f1c9c"} Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-oauth-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185585 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g48v\" (UniqueName: \"kubernetes.io/projected/8e37f410-52bf-4a62-86eb-f7cfa1659873-kube-api-access-6g48v\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185636 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-service-ca\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-oauth-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.185710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-trusted-ca-bundle\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.214131 4763 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9"] Oct 06 15:05:41 crc kubenswrapper[4763]: W1006 15:05:41.220956 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f18885b_ea4b_4790_9482_0d327c1873b3.slice/crio-24f20e198d31d78c270d3fa73df25e97f65392770b1f5d23db076fa57c445586 WatchSource:0}: Error finding container 24f20e198d31d78c270d3fa73df25e97f65392770b1f5d23db076fa57c445586: Status 404 returned error can't find the container with id 24f20e198d31d78c270d3fa73df25e97f65392770b1f5d23db076fa57c445586 Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.286791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-oauth-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.286872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g48v\" (UniqueName: \"kubernetes.io/projected/8e37f410-52bf-4a62-86eb-f7cfa1659873-kube-api-access-6g48v\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.286907 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.286933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.286968 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-service-ca\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.287011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-oauth-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.287049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.287085 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-trusted-ca-bundle\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.288200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.288722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-service-ca\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.288897 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-trusted-ca-bundle\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.291011 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e37f410-52bf-4a62-86eb-f7cfa1659873-oauth-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.291729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-oauth-config\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.292847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e37f410-52bf-4a62-86eb-f7cfa1659873-console-serving-cert\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.294572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8dc2d16-42c4-438c-930b-c6de9707aa93-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-vvbkc\" (UID: \"a8dc2d16-42c4-438c-930b-c6de9707aa93\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.307031 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g48v\" (UniqueName: \"kubernetes.io/projected/8e37f410-52bf-4a62-86eb-f7cfa1659873-kube-api-access-6g48v\") pod \"console-75f565c5d8-mmxgw\" (UID: \"8e37f410-52bf-4a62-86eb-f7cfa1659873\") " pod="openshift-console/console-75f565c5d8-mmxgw" Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.357152 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn"] Oct 06 15:05:41 crc kubenswrapper[4763]: W1006 
Oct 06 15:05:41 crc kubenswrapper[4763]: W1006 15:05:41.370936 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f21f303_50e0_4f36_aa64_53c5ae4f27c0.slice/crio-39c30ecc9a143b9be4c9bd29d70bf37308a25fe40a3270e737c95e4535c70870 WatchSource:0}: Error finding container 39c30ecc9a143b9be4c9bd29d70bf37308a25fe40a3270e737c95e4535c70870: Status 404 returned error can't find the container with id 39c30ecc9a143b9be4c9bd29d70bf37308a25fe40a3270e737c95e4535c70870
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.402721 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75f565c5d8-mmxgw"
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.513294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.706689 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"]
Oct 06 15:05:41 crc kubenswrapper[4763]: W1006 15:05:41.712095 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8dc2d16_42c4_438c_930b_c6de9707aa93.slice/crio-78754d64ea43e4cea839f343c81bd0d63548bec9057325ba0008cc32b627857d WatchSource:0}: Error finding container 78754d64ea43e4cea839f343c81bd0d63548bec9057325ba0008cc32b627857d: Status 404 returned error can't find the container with id 78754d64ea43e4cea839f343c81bd0d63548bec9057325ba0008cc32b627857d
Oct 06 15:05:41 crc kubenswrapper[4763]: I1006 15:05:41.815318 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f565c5d8-mmxgw"]
Oct 06 15:05:41 crc kubenswrapper[4763]: W1006 15:05:41.828889 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e37f410_52bf_4a62_86eb_f7cfa1659873.slice/crio-b01fb8bd0454449a5da5b68f423e60e39b51bf6a858762158780e660e9d5f0c7 WatchSource:0}: Error finding container b01fb8bd0454449a5da5b68f423e60e39b51bf6a858762158780e660e9d5f0c7: Status 404 returned error can't find the container with id b01fb8bd0454449a5da5b68f423e60e39b51bf6a858762158780e660e9d5f0c7
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.108243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9" event={"ID":"9f18885b-ea4b-4790-9482-0d327c1873b3","Type":"ContainerStarted","Data":"24f20e198d31d78c270d3fa73df25e97f65392770b1f5d23db076fa57c445586"}
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.109832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f565c5d8-mmxgw" event={"ID":"8e37f410-52bf-4a62-86eb-f7cfa1659873","Type":"ContainerStarted","Data":"cd12fc4c5cd6f1eb67caf76229ad06f4c55c323384c7ef80d4f6b62f0ee40f8c"}
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.109873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f565c5d8-mmxgw" event={"ID":"8e37f410-52bf-4a62-86eb-f7cfa1659873","Type":"ContainerStarted","Data":"b01fb8bd0454449a5da5b68f423e60e39b51bf6a858762158780e660e9d5f0c7"}
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.111287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" event={"ID":"a8dc2d16-42c4-438c-930b-c6de9707aa93","Type":"ContainerStarted","Data":"78754d64ea43e4cea839f343c81bd0d63548bec9057325ba0008cc32b627857d"}
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.112569 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn" event={"ID":"5f21f303-50e0-4f36-aa64-53c5ae4f27c0","Type":"ContainerStarted","Data":"39c30ecc9a143b9be4c9bd29d70bf37308a25fe40a3270e737c95e4535c70870"}
Oct 06 15:05:42 crc kubenswrapper[4763]: I1006 15:05:42.135239 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75f565c5d8-mmxgw" podStartSLOduration=1.135223946 podStartE2EDuration="1.135223946s" podCreationTimestamp="2025-10-06 15:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:05:42.132252113 +0000 UTC m=+739.287544635" watchObservedRunningTime="2025-10-06 15:05:42.135223946 +0000 UTC m=+739.290516458"
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.137017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4ftg" event={"ID":"6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef","Type":"ContainerStarted","Data":"c5b991d51870d7aaa40d310d898030f89700b0df06c63cea568c736d3da4672b"}
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.137667 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.140118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" event={"ID":"a8dc2d16-42c4-438c-930b-c6de9707aa93","Type":"ContainerStarted","Data":"2bc9c4d054c2a57d000468dcdae950c9a034c6c1ca4593be1cf20aea1a3508fb"}
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.140221 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc"
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.141441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn" event={"ID":"5f21f303-50e0-4f36-aa64-53c5ae4f27c0","Type":"ContainerStarted","Data":"ab37c953037ea5ea8380622c0cdce6963d4ffeb285ee5b5c51720214364a9ca0"}
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.142705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9" event={"ID":"9f18885b-ea4b-4790-9482-0d327c1873b3","Type":"ContainerStarted","Data":"19fd1d1723168f42c45034e5018d8c9453f5e4202bf9e6f1623f14574bb307f2"}
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.156883 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s4ftg" podStartSLOduration=2.088789301 podStartE2EDuration="5.156858587s" podCreationTimestamp="2025-10-06 15:05:40 +0000 UTC" firstStartedPulling="2025-10-06 15:05:40.96406358 +0000 UTC m=+738.119356092" lastFinishedPulling="2025-10-06 15:05:44.032132866 +0000 UTC m=+741.187425378" observedRunningTime="2025-10-06 15:05:45.152031601 +0000 UTC m=+742.307324133" watchObservedRunningTime="2025-10-06 15:05:45.156858587 +0000 UTC m=+742.312151119"
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.167660 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" podStartSLOduration=2.841857104 podStartE2EDuration="5.167611409s" podCreationTimestamp="2025-10-06 15:05:40 +0000 UTC" firstStartedPulling="2025-10-06 15:05:41.714148339 +0000 UTC m=+738.869440861" lastFinishedPulling="2025-10-06 15:05:44.039902664 +0000 UTC m=+741.195195166" observedRunningTime="2025-10-06 15:05:45.16691649 +0000 UTC m=+742.322209022" watchObservedRunningTime="2025-10-06 15:05:45.167611409 +0000 UTC m=+742.322903931"
Oct 06 15:05:45 crc kubenswrapper[4763]: I1006 15:05:45.237478 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-nbrs9" podStartSLOduration=2.426438137 podStartE2EDuration="5.237461352s" podCreationTimestamp="2025-10-06 15:05:40 +0000 UTC" firstStartedPulling="2025-10-06 15:05:41.223018154 +0000 UTC m=+738.378310656" lastFinishedPulling="2025-10-06 15:05:44.034041359 +0000 UTC m=+741.189333871" observedRunningTime="2025-10-06 15:05:45.236326331 +0000 UTC m=+742.391618863" watchObservedRunningTime="2025-10-06 15:05:45.237461352 +0000 UTC m=+742.392753864"
Oct 06 15:05:48 crc kubenswrapper[4763]: I1006 15:05:48.166773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn" event={"ID":"5f21f303-50e0-4f36-aa64-53c5ae4f27c0","Type":"ContainerStarted","Data":"391c6cbd8459857d8da13c85c7f7a14e9cf7a4acab31e519b75b3f9ed59b7fce"}
Oct 06 15:05:48 crc kubenswrapper[4763]: I1006 15:05:48.190226 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kvgpn" podStartSLOduration=2.233097054 podStartE2EDuration="8.190207257s" podCreationTimestamp="2025-10-06 15:05:40 +0000 UTC" firstStartedPulling="2025-10-06 15:05:41.374030285 +0000 UTC m=+738.529322797" lastFinishedPulling="2025-10-06 15:05:47.331140478 +0000 UTC m=+744.486433000" observedRunningTime="2025-10-06 15:05:48.188788937 +0000 UTC m=+745.344081509" watchObservedRunningTime="2025-10-06 15:05:48.190207257 +0000 UTC m=+745.345499769"
Oct 06 15:05:50 crc kubenswrapper[4763]: I1006 15:05:50.960325 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s4ftg"
Oct 06 15:05:51 crc kubenswrapper[4763]: I1006 15:05:51.403172 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75f565c5d8-mmxgw"
Oct 06 15:05:51 crc kubenswrapper[4763]: I1006 15:05:51.403232 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75f565c5d8-mmxgw"
Oct 06 15:05:51 crc kubenswrapper[4763]: I1006 15:05:51.410421 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75f565c5d8-mmxgw"
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.099968 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"]
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.100291 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerName="controller-manager" containerID="cri-o://24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87" gracePeriod=30
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.186347 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"]
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.186745 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" podUID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" containerName="route-controller-manager" containerID="cri-o://0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d" gracePeriod=30
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.196509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75f565c5d8-mmxgw"
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.242397 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"]
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.537694 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9"
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.541383 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.680431 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles\") pod \"b02ef795-1c7c-49fe-bd12-00043942b97f\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") "
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.681528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b02ef795-1c7c-49fe-bd12-00043942b97f" (UID: "b02ef795-1c7c-49fe-bd12-00043942b97f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.681714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config\") pod \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") "
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.682477 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config" (OuterVolumeSpecName: "config") pod "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" (UID: "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.682560 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca\") pod \"b02ef795-1c7c-49fe-bd12-00043942b97f\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") "
Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.683240 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b02ef795-1c7c-49fe-bd12-00043942b97f" (UID: "b02ef795-1c7c-49fe-bd12-00043942b97f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.683280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx586\" (UniqueName: \"kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586\") pod \"b02ef795-1c7c-49fe-bd12-00043942b97f\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.683329 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca\") pod \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.683355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config\") pod \"b02ef795-1c7c-49fe-bd12-00043942b97f\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.684111 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config" (OuterVolumeSpecName: "config") pod "b02ef795-1c7c-49fe-bd12-00043942b97f" (UID: "b02ef795-1c7c-49fe-bd12-00043942b97f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.684660 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" (UID: "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.684787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert\") pod \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.684814 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert\") pod \"b02ef795-1c7c-49fe-bd12-00043942b97f\" (UID: \"b02ef795-1c7c-49fe-bd12-00043942b97f\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685477 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9gj\" (UniqueName: \"kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj\") pod \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\" (UID: \"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a\") " Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685866 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685894 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685906 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685918 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.685930 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ef795-1c7c-49fe-bd12-00043942b97f-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.688569 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b02ef795-1c7c-49fe-bd12-00043942b97f" (UID: "b02ef795-1c7c-49fe-bd12-00043942b97f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.689024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" (UID: "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.689324 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586" (OuterVolumeSpecName: "kube-api-access-jx586") pod "b02ef795-1c7c-49fe-bd12-00043942b97f" (UID: "b02ef795-1c7c-49fe-bd12-00043942b97f"). InnerVolumeSpecName "kube-api-access-jx586". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.690561 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj" (OuterVolumeSpecName: "kube-api-access-nh9gj") pod "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" (UID: "7a6bf75e-d386-4a0b-92ba-fa9b01645e4a"). InnerVolumeSpecName "kube-api-access-nh9gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.787338 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx586\" (UniqueName: \"kubernetes.io/projected/b02ef795-1c7c-49fe-bd12-00043942b97f-kube-api-access-jx586\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.787382 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.787394 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02ef795-1c7c-49fe-bd12-00043942b97f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:52 crc kubenswrapper[4763]: I1006 15:05:52.787405 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9gj\" (UniqueName: \"kubernetes.io/projected/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a-kube-api-access-nh9gj\") on node \"crc\" DevicePath \"\"" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.198909 4763 generic.go:334] "Generic (PLEG): container finished" podID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerID="24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87" exitCode=0 Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.198963 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.198982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" event={"ID":"b02ef795-1c7c-49fe-bd12-00043942b97f","Type":"ContainerDied","Data":"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87"} Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.200347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pr9h9" event={"ID":"b02ef795-1c7c-49fe-bd12-00043942b97f","Type":"ContainerDied","Data":"761eca6bdb67c1d45da3f0e5487c143cebc1505b33f80685355f72494d34f2fc"} Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.200384 4763 scope.go:117] "RemoveContainer" containerID="24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.204135 4763 generic.go:334] "Generic (PLEG): container finished" podID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" containerID="0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d" exitCode=0 Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.204169 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.204184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" event={"ID":"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a","Type":"ContainerDied","Data":"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d"} Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.206109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h" event={"ID":"7a6bf75e-d386-4a0b-92ba-fa9b01645e4a","Type":"ContainerDied","Data":"f9c26c69457ba6d8eeb7e4451aa02b12d463180a240f1f10a01175a13ee0e713"} Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.227481 4763 scope.go:117] "RemoveContainer" containerID="24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87" Oct 06 15:05:53 crc kubenswrapper[4763]: E1006 15:05:53.228899 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87\": container with ID starting with 24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87 not found: ID does not exist" containerID="24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.229071 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87"} err="failed to get container status \"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87\": rpc error: code = NotFound desc = could not find container \"24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87\": container with ID starting with 24785398540c2429ee6000453deecd72ad2043fdb8b574d5f5a0d72bd1fb2e87 not found: ID does not exist" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.229171 4763 scope.go:117] "RemoveContainer" containerID="0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d" Oct 06 
15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.229661 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"] Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.246773 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pr9h9"] Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.252482 4763 scope.go:117] "RemoveContainer" containerID="0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.252901 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"] Oct 06 15:05:53 crc kubenswrapper[4763]: E1006 15:05:53.253092 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d\": container with ID starting with 0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d not found: ID does not exist" containerID="0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.253122 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d"} err="failed to get container status \"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d\": rpc error: code = NotFound desc = could not find container \"0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d\": container with ID starting with 0f4b67d6fe6affeee4bcb6f1f8d9668072e5505088ccda7b5408e7393fe1d23d not found: ID does not exist" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.258009 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dh45h"] Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.581368 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" path="/var/lib/kubelet/pods/7a6bf75e-d386-4a0b-92ba-fa9b01645e4a/volumes" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.582517 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" path="/var/lib/kubelet/pods/b02ef795-1c7c-49fe-bd12-00043942b97f/volumes" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897036 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-757f987fcf-7tds7"] Oct 06 15:05:53 crc kubenswrapper[4763]: E1006 15:05:53.897292 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerName="controller-manager" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897308 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerName="controller-manager" Oct 06 15:05:53 crc kubenswrapper[4763]: E1006 15:05:53.897330 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" containerName="route-controller-manager" Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897337 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" containerName="route-controller-manager" Oct 06 15:05:53 crc 
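
The NotFound errors above are the benign half of idempotent container removal: the kubelet asks for the status of a container CRI-O has already deleted, log.go records the rpc NotFound, pod_container_deletor surfaces it as "DeleteContainer returned error", and the SyncLoop REMOVE still completes. It is only worth chasing when the ID never produced a ContainerDied event first. A sketch of that check (illustrative file path):

    import re

    text = open("kubelet.log").read()   # illustrative path to a saved journal capture

    died     = set(re.findall(r'"Type":"ContainerDied","Data":"([0-9a-f]{64})"', text))
    notfound = set(re.findall(r'could not find container \\"([0-9a-f]{64})\\"', text))

    # NotFound for a container that never died would be real state drift; in
    # this capture both 24785... and 0f4b6... died first, so the diff is empty.
    print("suspicious:", {c[:12] for c in notfound - died} or "none")
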
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897466 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6bf75e-d386-4a0b-92ba-fa9b01645e4a" containerName="route-controller-manager"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897487 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b02ef795-1c7c-49fe-bd12-00043942b97f" containerName="controller-manager"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.897913 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.900894 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.901139 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.901292 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.901536 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.901757 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.901936 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.902311 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7"]
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.902844 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.904882 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.905570 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.905719 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.905816 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.905967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.906189 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.911431 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-757f987fcf-7tds7"]
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.935471 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 15:05:53 crc kubenswrapper[4763]: I1006 15:05:53.943011 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7"]
Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8nks\" (UniqueName: \"kubernetes.io/projected/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-kube-api-access-b8nks\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7"
Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-config\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7"
Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-client-ca\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7"
Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f704fd-3a1a-4735-94d3-075d0bb300e7-serving-cert\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7"
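
Before the replacement pods' volumes mount, the kubelet registers per-object reflectors for each Secret and ConfigMap the new pod specs reference; the "Caches populated" lines above are those informer caches warming, and in this capture they all precede the corresponding volume operations. A sketch that tallies them per namespace and kind (illustrative file path):

    import re
    from collections import Counter

    text = open("kubelet.log").read()   # illustrative path to a saved journal capture

    refs = re.findall(r'Caches populated for \*v1\.(\w+) from object-"([^"]+)"/"([^"]+)"', text)
    by_ns = Counter((ns, kind) for kind, ns, name in refs)
    for (ns, kind), n in sorted(by_ns.items()):
        print(f"{ns}: {n} {kind} cache(s)")
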
pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003648 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgg6\" (UniqueName: \"kubernetes.io/projected/82f704fd-3a1a-4735-94d3-075d0bb300e7-kube-api-access-lmgg6\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003670 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-client-ca\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003685 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-config\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-serving-cert\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.003728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-proxy-ca-bundles\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.105956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8nks\" (UniqueName: \"kubernetes.io/projected/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-kube-api-access-b8nks\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-config\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106095 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-client-ca\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " 
pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f704fd-3a1a-4735-94d3-075d0bb300e7-serving-cert\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgg6\" (UniqueName: \"kubernetes.io/projected/82f704fd-3a1a-4735-94d3-075d0bb300e7-kube-api-access-lmgg6\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-client-ca\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106259 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-config\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106285 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-serving-cert\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.106306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-proxy-ca-bundles\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.108254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-proxy-ca-bundles\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.110417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-config\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.111316 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-client-ca\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.112848 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-client-ca\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.113955 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f704fd-3a1a-4735-94d3-075d0bb300e7-config\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.129673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8nks\" (UniqueName: \"kubernetes.io/projected/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-kube-api-access-b8nks\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.136092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f704fd-3a1a-4735-94d3-075d0bb300e7-serving-cert\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.136358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0935265-2a03-4dbf-ab7f-970d8f3bfd34-serving-cert\") pod \"controller-manager-757f987fcf-7tds7\" (UID: \"a0935265-2a03-4dbf-ab7f-970d8f3bfd34\") " pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.152507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgg6\" (UniqueName: \"kubernetes.io/projected/82f704fd-3a1a-4735-94d3-075d0bb300e7-kube-api-access-lmgg6\") pod \"route-controller-manager-594fd789cf-khcv7\" (UID: \"82f704fd-3a1a-4735-94d3-075d0bb300e7\") " pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.229925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.240281 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.461814 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7"] Oct 06 15:05:54 crc kubenswrapper[4763]: W1006 15:05:54.470286 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f704fd_3a1a_4735_94d3_075d0bb300e7.slice/crio-bbff3dc5cc5702a61063217dd32ff706289fafc1bd3672508d727e5a51808296 WatchSource:0}: Error finding container bbff3dc5cc5702a61063217dd32ff706289fafc1bd3672508d727e5a51808296: Status 404 returned error can't find the container with id bbff3dc5cc5702a61063217dd32ff706289fafc1bd3672508d727e5a51808296 Oct 06 15:05:54 crc kubenswrapper[4763]: I1006 15:05:54.507197 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-757f987fcf-7tds7"] Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.217905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" event={"ID":"82f704fd-3a1a-4735-94d3-075d0bb300e7","Type":"ContainerStarted","Data":"a615b03a3361a50ef5825da55fc20e627a5d88f892184e7dd9a819b27dd9745e"} Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.218335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" event={"ID":"82f704fd-3a1a-4735-94d3-075d0bb300e7","Type":"ContainerStarted","Data":"bbff3dc5cc5702a61063217dd32ff706289fafc1bd3672508d727e5a51808296"} Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.218368 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.220464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" event={"ID":"a0935265-2a03-4dbf-ab7f-970d8f3bfd34","Type":"ContainerStarted","Data":"64eba2dfa6dfa0a940e70c752d4a93b381bf27fb0fc09a9af653475385c78dbf"} Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.220526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" event={"ID":"a0935265-2a03-4dbf-ab7f-970d8f3bfd34","Type":"ContainerStarted","Data":"f22773a3c5d995dc32becef0b1a0311be98b01840a309f91b6897263ec709394"} Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.220911 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.226763 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.231197 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.240830 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-594fd789cf-khcv7" podStartSLOduration=2.240808312 
podStartE2EDuration="2.240808312s" podCreationTimestamp="2025-10-06 15:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:05:55.239066813 +0000 UTC m=+752.394359325" watchObservedRunningTime="2025-10-06 15:05:55.240808312 +0000 UTC m=+752.396100834" Oct 06 15:05:55 crc kubenswrapper[4763]: I1006 15:05:55.259115 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-757f987fcf-7tds7" podStartSLOduration=3.2591001459999998 podStartE2EDuration="3.259100146s" podCreationTimestamp="2025-10-06 15:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:05:55.258218081 +0000 UTC m=+752.413510613" watchObservedRunningTime="2025-10-06 15:05:55.259100146 +0000 UTC m=+752.414392658" Oct 06 15:06:01 crc kubenswrapper[4763]: I1006 15:06:01.523510 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-vvbkc" Oct 06 15:06:02 crc kubenswrapper[4763]: I1006 15:06:02.595323 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.748970 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.751108 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.760922 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.778195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.778455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.778501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvqq\" (UniqueName: \"kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.880109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 
15:06:12.880261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.880277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvqq\" (UniqueName: \"kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.881278 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.882720 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:12 crc kubenswrapper[4763]: I1006 15:06:12.907680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvqq\" (UniqueName: \"kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq\") pod \"redhat-marketplace-6tvdc\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:13 crc kubenswrapper[4763]: I1006 15:06:13.067100 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:13 crc kubenswrapper[4763]: I1006 15:06:13.511554 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:13 crc kubenswrapper[4763]: W1006 15:06:13.528875 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5daf3f9_9a26_47c2_88fa_4c2827989913.slice/crio-1a34a8a3aaecdadbe221489bbe5651b7b50b48141b5c88a4742af7f6af44b742 WatchSource:0}: Error finding container 1a34a8a3aaecdadbe221489bbe5651b7b50b48141b5c88a4742af7f6af44b742: Status 404 returned error can't find the container with id 1a34a8a3aaecdadbe221489bbe5651b7b50b48141b5c88a4742af7f6af44b742 Oct 06 15:06:14 crc kubenswrapper[4763]: I1006 15:06:14.346379 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerID="c6c97ed67738da6dbfd5cb620e4bc135bab8b4922a38071859e705b9b739b0d9" exitCode=0 Oct 06 15:06:14 crc kubenswrapper[4763]: I1006 15:06:14.346685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerDied","Data":"c6c97ed67738da6dbfd5cb620e4bc135bab8b4922a38071859e705b9b739b0d9"} Oct 06 15:06:14 crc kubenswrapper[4763]: I1006 15:06:14.347184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerStarted","Data":"1a34a8a3aaecdadbe221489bbe5651b7b50b48141b5c88a4742af7f6af44b742"} Oct 06 15:06:15 crc kubenswrapper[4763]: I1006 15:06:15.354143 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerID="a0720add2bac8aa17cf8bc8c60a3b9a890afab612211723e3a9449e907427467" exitCode=0 Oct 06 15:06:15 crc kubenswrapper[4763]: I1006 15:06:15.354297 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerDied","Data":"a0720add2bac8aa17cf8bc8c60a3b9a890afab612211723e3a9449e907427467"} Oct 06 15:06:16 crc kubenswrapper[4763]: I1006 15:06:16.369734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerStarted","Data":"50acef9ebf899f7ad531e9978f5f77f10078fd86d915c705945751a6a754d2a1"} Oct 06 15:06:16 crc kubenswrapper[4763]: I1006 15:06:16.393076 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6tvdc" podStartSLOduration=2.92221151 podStartE2EDuration="4.393044911s" podCreationTimestamp="2025-10-06 15:06:12 +0000 UTC" firstStartedPulling="2025-10-06 15:06:14.352847397 +0000 UTC m=+771.508139909" lastFinishedPulling="2025-10-06 15:06:15.823680768 +0000 UTC m=+772.978973310" observedRunningTime="2025-10-06 15:06:16.392396543 +0000 UTC m=+773.547689095" watchObservedRunningTime="2025-10-06 15:06:16.393044911 +0000 UTC m=+773.548337463" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.300579 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-t2f4r" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerName="console" 
containerID="cri-o://4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f" gracePeriod=15 Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.802368 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t2f4r_fc281a39-2f6b-407d-a27e-0d19025186d7/console/0.log" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.802745 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848581 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848642 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrj8q\" (UniqueName: \"kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848772 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848793 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.848826 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle\") pod \"fc281a39-2f6b-407d-a27e-0d19025186d7\" (UID: \"fc281a39-2f6b-407d-a27e-0d19025186d7\") " Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.849602 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config" (OuterVolumeSpecName: "console-config") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.849672 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.849697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.850139 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.861881 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.861936 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q" (OuterVolumeSpecName: "kube-api-access-qrj8q") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "kube-api-access-qrj8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.862189 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fc281a39-2f6b-407d-a27e-0d19025186d7" (UID: "fc281a39-2f6b-407d-a27e-0d19025186d7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950410 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950460 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950479 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950496 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrj8q\" (UniqueName: \"kubernetes.io/projected/fc281a39-2f6b-407d-a27e-0d19025186d7-kube-api-access-qrj8q\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950517 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950533 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc281a39-2f6b-407d-a27e-0d19025186d7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:17 crc kubenswrapper[4763]: I1006 15:06:17.950551 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc281a39-2f6b-407d-a27e-0d19025186d7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387415 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t2f4r_fc281a39-2f6b-407d-a27e-0d19025186d7/console/0.log" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387486 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerID="4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f" exitCode=2 Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387528 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2f4r" event={"ID":"fc281a39-2f6b-407d-a27e-0d19025186d7","Type":"ContainerDied","Data":"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f"} Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2f4r" event={"ID":"fc281a39-2f6b-407d-a27e-0d19025186d7","Type":"ContainerDied","Data":"8d69643ee0562150fc8e8ea83829bf2dfeba0445db944f83458f1642412473d1"} Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387593 4763 scope.go:117] "RemoveContainer" containerID="4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.387778 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t2f4r" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.424934 4763 scope.go:117] "RemoveContainer" containerID="4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f" Oct 06 15:06:18 crc kubenswrapper[4763]: E1006 15:06:18.427081 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f\": container with ID starting with 4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f not found: ID does not exist" containerID="4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.427175 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f"} err="failed to get container status \"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f\": rpc error: code = NotFound desc = could not find container \"4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f\": container with ID starting with 4765dfb34d5347566a25849ac81c383f59cfacc5a9fe7a1cd28a9b45bdc5691f not found: ID does not exist" Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.433709 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"] Oct 06 15:06:18 crc kubenswrapper[4763]: I1006 15:06:18.437754 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-t2f4r"] Oct 06 15:06:19 crc kubenswrapper[4763]: I1006 15:06:19.589537 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" path="/var/lib/kubelet/pods/fc281a39-2f6b-407d-a27e-0d19025186d7/volumes" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.067946 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.068293 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.132709 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.487590 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.761933 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:23 crc kubenswrapper[4763]: E1006 15:06:23.762439 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerName="console" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.762479 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerName="console" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.762795 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc281a39-2f6b-407d-a27e-0d19025186d7" containerName="console" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.764828 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.764848 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.870864 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.871255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.871413 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmgb\" (UniqueName: \"kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.972202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsmgb\" (UniqueName: \"kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.972308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.972342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.973042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.973179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:23 crc kubenswrapper[4763]: I1006 15:06:23.995761 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xsmgb\" (UniqueName: \"kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb\") pod \"redhat-operators-gr2vs\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:24 crc kubenswrapper[4763]: I1006 15:06:24.145520 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:24 crc kubenswrapper[4763]: I1006 15:06:24.547745 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:24 crc kubenswrapper[4763]: W1006 15:06:24.554951 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2673a5a7_684d_422b_bca1_b1b8f79b31d1.slice/crio-a3edc7662492aeeb9573271d0f7d21b9bd541e1d67ccb0bc87e3ac794d42aed7 WatchSource:0}: Error finding container a3edc7662492aeeb9573271d0f7d21b9bd541e1d67ccb0bc87e3ac794d42aed7: Status 404 returned error can't find the container with id a3edc7662492aeeb9573271d0f7d21b9bd541e1d67ccb0bc87e3ac794d42aed7 Oct 06 15:06:25 crc kubenswrapper[4763]: I1006 15:06:25.435324 4763 generic.go:334] "Generic (PLEG): container finished" podID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerID="3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a" exitCode=0 Oct 06 15:06:25 crc kubenswrapper[4763]: I1006 15:06:25.435368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerDied","Data":"3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a"} Oct 06 15:06:25 crc kubenswrapper[4763]: I1006 15:06:25.435393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerStarted","Data":"a3edc7662492aeeb9573271d0f7d21b9bd541e1d67ccb0bc87e3ac794d42aed7"} Oct 06 15:06:26 crc kubenswrapper[4763]: I1006 15:06:26.449766 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerStarted","Data":"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31"} Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.461222 4763 generic.go:334] "Generic (PLEG): container finished" podID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerID="b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31" exitCode=0 Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.461395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerDied","Data":"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31"} Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.803543 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8"] Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.805735 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.808857 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.827562 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8"] Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.994875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.994984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:27 crc kubenswrapper[4763]: I1006 15:06:27.995066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf9ck\" (UniqueName: \"kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.096836 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf9ck\" (UniqueName: \"kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.096939 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.096987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.097939 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.098925 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.130268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf9ck\" (UniqueName: \"kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.135965 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.147136 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.147529 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6tvdc" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="registry-server" containerID="cri-o://50acef9ebf899f7ad531e9978f5f77f10078fd86d915c705945751a6a754d2a1" gracePeriod=2 Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.469449 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerID="50acef9ebf899f7ad531e9978f5f77f10078fd86d915c705945751a6a754d2a1" exitCode=0 Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.469714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerDied","Data":"50acef9ebf899f7ad531e9978f5f77f10078fd86d915c705945751a6a754d2a1"} Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.471148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerStarted","Data":"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87"} Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.491259 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gr2vs" podStartSLOduration=3.01293508 podStartE2EDuration="5.49124107s" podCreationTimestamp="2025-10-06 15:06:23 +0000 UTC" firstStartedPulling="2025-10-06 15:06:25.438521992 +0000 UTC m=+782.593814534" lastFinishedPulling="2025-10-06 15:06:27.916827992 +0000 UTC m=+785.072120524" observedRunningTime="2025-10-06 15:06:28.490749126 +0000 UTC m=+785.646041718" watchObservedRunningTime="2025-10-06 15:06:28.49124107 +0000 UTC m=+785.646533592" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 
15:06:28.632222 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8"] Oct 06 15:06:28 crc kubenswrapper[4763]: W1006 15:06:28.635069 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a63df7_b6aa_47b5_a09c_0c8b4af726df.slice/crio-deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c WatchSource:0}: Error finding container deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c: Status 404 returned error can't find the container with id deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.650654 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.806909 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content\") pod \"d5daf3f9-9a26-47c2-88fa-4c2827989913\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.806944 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvqq\" (UniqueName: \"kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq\") pod \"d5daf3f9-9a26-47c2-88fa-4c2827989913\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.807029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities\") pod \"d5daf3f9-9a26-47c2-88fa-4c2827989913\" (UID: \"d5daf3f9-9a26-47c2-88fa-4c2827989913\") " Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.807784 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities" (OuterVolumeSpecName: "utilities") pod "d5daf3f9-9a26-47c2-88fa-4c2827989913" (UID: "d5daf3f9-9a26-47c2-88fa-4c2827989913"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.811837 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq" (OuterVolumeSpecName: "kube-api-access-fcvqq") pod "d5daf3f9-9a26-47c2-88fa-4c2827989913" (UID: "d5daf3f9-9a26-47c2-88fa-4c2827989913"). InnerVolumeSpecName "kube-api-access-fcvqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.838280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5daf3f9-9a26-47c2-88fa-4c2827989913" (UID: "d5daf3f9-9a26-47c2-88fa-4c2827989913"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.908490 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.908542 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcvqq\" (UniqueName: \"kubernetes.io/projected/d5daf3f9-9a26-47c2-88fa-4c2827989913-kube-api-access-fcvqq\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:28 crc kubenswrapper[4763]: I1006 15:06:28.908562 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5daf3f9-9a26-47c2-88fa-4c2827989913-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.479833 4763 generic.go:334] "Generic (PLEG): container finished" podID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerID="6ba686ac68a7093a4ecf0ce6f054d8695cee39659bd869d6b83975bdea4a7a51" exitCode=0 Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.479917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" event={"ID":"14a63df7-b6aa-47b5-a09c-0c8b4af726df","Type":"ContainerDied","Data":"6ba686ac68a7093a4ecf0ce6f054d8695cee39659bd869d6b83975bdea4a7a51"} Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.479952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" event={"ID":"14a63df7-b6aa-47b5-a09c-0c8b4af726df","Type":"ContainerStarted","Data":"deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c"} Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.484303 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tvdc" Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.484551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tvdc" event={"ID":"d5daf3f9-9a26-47c2-88fa-4c2827989913","Type":"ContainerDied","Data":"1a34a8a3aaecdadbe221489bbe5651b7b50b48141b5c88a4742af7f6af44b742"} Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.484596 4763 scope.go:117] "RemoveContainer" containerID="50acef9ebf899f7ad531e9978f5f77f10078fd86d915c705945751a6a754d2a1" Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.507688 4763 scope.go:117] "RemoveContainer" containerID="a0720add2bac8aa17cf8bc8c60a3b9a890afab612211723e3a9449e907427467" Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.521834 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.525966 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tvdc"] Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.542198 4763 scope.go:117] "RemoveContainer" containerID="c6c97ed67738da6dbfd5cb620e4bc135bab8b4922a38071859e705b9b739b0d9" Oct 06 15:06:29 crc kubenswrapper[4763]: I1006 15:06:29.581826 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" path="/var/lib/kubelet/pods/d5daf3f9-9a26-47c2-88fa-4c2827989913/volumes" Oct 06 15:06:31 crc kubenswrapper[4763]: I1006 15:06:31.504740 4763 generic.go:334] "Generic (PLEG): container finished" podID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerID="5cfa263965139f9549baa4947e8e17dae80e0b1a250ce33dd2da21db2f7b7d88" exitCode=0 Oct 06 15:06:31 crc kubenswrapper[4763]: I1006 15:06:31.504782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" event={"ID":"14a63df7-b6aa-47b5-a09c-0c8b4af726df","Type":"ContainerDied","Data":"5cfa263965139f9549baa4947e8e17dae80e0b1a250ce33dd2da21db2f7b7d88"} Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.512354 4763 generic.go:334] "Generic (PLEG): container finished" podID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerID="4ed47474659f2d60d657a4b67929f8357b31c5b467284c85d4ff585ab97a6f4d" exitCode=0 Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.512407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" event={"ID":"14a63df7-b6aa-47b5-a09c-0c8b4af726df","Type":"ContainerDied","Data":"4ed47474659f2d60d657a4b67929f8357b31c5b467284c85d4ff585ab97a6f4d"} Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.753326 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:32 crc kubenswrapper[4763]: E1006 15:06:32.753524 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="registry-server" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.753535 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="registry-server" Oct 06 15:06:32 crc kubenswrapper[4763]: E1006 15:06:32.753551 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="extract-content" Oct 06 15:06:32 crc 
kubenswrapper[4763]: I1006 15:06:32.753558 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="extract-content" Oct 06 15:06:32 crc kubenswrapper[4763]: E1006 15:06:32.753575 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="extract-utilities" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.753581 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="extract-utilities" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.753686 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5daf3f9-9a26-47c2-88fa-4c2827989913" containerName="registry-server" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.754388 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.762526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.762580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rkb\" (UniqueName: \"kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.762669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.763979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.864184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.864278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.864319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55rkb\" (UniqueName: \"kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " 
pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.864792 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.864944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:32 crc kubenswrapper[4763]: I1006 15:06:32.894507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55rkb\" (UniqueName: \"kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb\") pod \"community-operators-qbszm\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.073018 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.508949 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.842724 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.877081 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.877136 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.977671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf9ck\" (UniqueName: \"kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck\") pod \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.978182 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util\") pod \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.978237 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle\") pod 
\"14a63df7-b6aa-47b5-a09c-0c8b4af726df\" (UID: \"14a63df7-b6aa-47b5-a09c-0c8b4af726df\") " Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.979694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle" (OuterVolumeSpecName: "bundle") pod "14a63df7-b6aa-47b5-a09c-0c8b4af726df" (UID: "14a63df7-b6aa-47b5-a09c-0c8b4af726df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:33 crc kubenswrapper[4763]: I1006 15:06:33.986136 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck" (OuterVolumeSpecName: "kube-api-access-wf9ck") pod "14a63df7-b6aa-47b5-a09c-0c8b4af726df" (UID: "14a63df7-b6aa-47b5-a09c-0c8b4af726df"). InnerVolumeSpecName "kube-api-access-wf9ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.080206 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.080463 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf9ck\" (UniqueName: \"kubernetes.io/projected/14a63df7-b6aa-47b5-a09c-0c8b4af726df-kube-api-access-wf9ck\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.146373 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.146435 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.174509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util" (OuterVolumeSpecName: "util") pod "14a63df7-b6aa-47b5-a09c-0c8b4af726df" (UID: "14a63df7-b6aa-47b5-a09c-0c8b4af726df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.181602 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a63df7-b6aa-47b5-a09c-0c8b4af726df-util\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.227972 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.537723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" event={"ID":"14a63df7-b6aa-47b5-a09c-0c8b4af726df","Type":"ContainerDied","Data":"deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c"} Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.537805 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb5fed63f0ab16d56524b67438c00658e6e6fa5202f2dfbf80e468dff74866c" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.537742 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8" Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.540482 4763 generic.go:334] "Generic (PLEG): container finished" podID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerID="ab92609bc9729b3fd7ef8b1a9facdcd0c6a8999ebfdbe15a28759c72f422ff35" exitCode=0 Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.540597 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerDied","Data":"ab92609bc9729b3fd7ef8b1a9facdcd0c6a8999ebfdbe15a28759c72f422ff35"} Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.540673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerStarted","Data":"faf1e318e68a9cdf35234c72c179b71ca7e99d459d1ba373244ad40be7a6f412"} Oct 06 15:06:34 crc kubenswrapper[4763]: I1006 15:06:34.614219 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:35 crc kubenswrapper[4763]: I1006 15:06:35.553423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerStarted","Data":"936251f2ef65c013e14a0bd8a213be36524704c4035cad4bf6ab1bcbc4a028bd"} Oct 06 15:06:36 crc kubenswrapper[4763]: I1006 15:06:36.562461 4763 generic.go:334] "Generic (PLEG): container finished" podID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerID="936251f2ef65c013e14a0bd8a213be36524704c4035cad4bf6ab1bcbc4a028bd" exitCode=0 Oct 06 15:06:36 crc kubenswrapper[4763]: I1006 15:06:36.562509 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerDied","Data":"936251f2ef65c013e14a0bd8a213be36524704c4035cad4bf6ab1bcbc4a028bd"} Oct 06 15:06:37 crc kubenswrapper[4763]: I1006 15:06:37.566865 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:37 crc kubenswrapper[4763]: I1006 15:06:37.567811 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gr2vs" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="registry-server" containerID="cri-o://d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87" gracePeriod=2 Oct 06 15:06:37 crc kubenswrapper[4763]: I1006 15:06:37.586069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerStarted","Data":"54e9cfa640317161f1fcea4a553d6541bea14951185ec0f665558e764bb3fdbe"} Oct 06 15:06:37 crc kubenswrapper[4763]: I1006 15:06:37.607290 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qbszm" podStartSLOduration=3.191507529 podStartE2EDuration="5.607261167s" podCreationTimestamp="2025-10-06 15:06:32 +0000 UTC" firstStartedPulling="2025-10-06 15:06:34.543442931 +0000 UTC m=+791.698735453" lastFinishedPulling="2025-10-06 15:06:36.959196569 +0000 UTC m=+794.114489091" observedRunningTime="2025-10-06 15:06:37.601214439 +0000 UTC m=+794.756506961" watchObservedRunningTime="2025-10-06 
15:06:37.607261167 +0000 UTC m=+794.762553719" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.521844 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.553582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsmgb\" (UniqueName: \"kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb\") pod \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.554560 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content\") pod \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.554667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities\") pod \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\" (UID: \"2673a5a7-684d-422b-bca1-b1b8f79b31d1\") " Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.557068 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities" (OuterVolumeSpecName: "utilities") pod "2673a5a7-684d-422b-bca1-b1b8f79b31d1" (UID: "2673a5a7-684d-422b-bca1-b1b8f79b31d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.592795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb" (OuterVolumeSpecName: "kube-api-access-xsmgb") pod "2673a5a7-684d-422b-bca1-b1b8f79b31d1" (UID: "2673a5a7-684d-422b-bca1-b1b8f79b31d1"). InnerVolumeSpecName "kube-api-access-xsmgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.623853 4763 generic.go:334] "Generic (PLEG): container finished" podID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerID="d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87" exitCode=0 Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.623957 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2vs" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.624958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerDied","Data":"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87"} Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.625001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2vs" event={"ID":"2673a5a7-684d-422b-bca1-b1b8f79b31d1","Type":"ContainerDied","Data":"a3edc7662492aeeb9573271d0f7d21b9bd541e1d67ccb0bc87e3ac794d42aed7"} Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.625025 4763 scope.go:117] "RemoveContainer" containerID="d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.647336 4763 scope.go:117] "RemoveContainer" containerID="b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.652322 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2673a5a7-684d-422b-bca1-b1b8f79b31d1" (UID: "2673a5a7-684d-422b-bca1-b1b8f79b31d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.656233 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.656260 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2673a5a7-684d-422b-bca1-b1b8f79b31d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.656272 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsmgb\" (UniqueName: \"kubernetes.io/projected/2673a5a7-684d-422b-bca1-b1b8f79b31d1-kube-api-access-xsmgb\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.673406 4763 scope.go:117] "RemoveContainer" containerID="3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.691771 4763 scope.go:117] "RemoveContainer" containerID="d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87" Oct 06 15:06:38 crc kubenswrapper[4763]: E1006 15:06:38.692205 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87\": container with ID starting with d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87 not found: ID does not exist" containerID="d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.692246 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87"} err="failed to get container status \"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87\": rpc error: code = NotFound desc = could not find container 
\"d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87\": container with ID starting with d97331d091a11321b18a19878f1beea53dfb90227048217d169e259b91099f87 not found: ID does not exist" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.692277 4763 scope.go:117] "RemoveContainer" containerID="b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31" Oct 06 15:06:38 crc kubenswrapper[4763]: E1006 15:06:38.692489 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31\": container with ID starting with b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31 not found: ID does not exist" containerID="b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.692508 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31"} err="failed to get container status \"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31\": rpc error: code = NotFound desc = could not find container \"b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31\": container with ID starting with b131001364efb37153010b876d87a0835ea039b4cd6340f32139c3e5c9c9ce31 not found: ID does not exist" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.692520 4763 scope.go:117] "RemoveContainer" containerID="3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a" Oct 06 15:06:38 crc kubenswrapper[4763]: E1006 15:06:38.693256 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a\": container with ID starting with 3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a not found: ID does not exist" containerID="3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.693283 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a"} err="failed to get container status \"3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a\": rpc error: code = NotFound desc = could not find container \"3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a\": container with ID starting with 3856d06538120e11e012db7e7ba84afbfe8a3992841665c3289e386dc7cc4d0a not found: ID does not exist" Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.958996 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:38 crc kubenswrapper[4763]: I1006 15:06:38.962708 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gr2vs"] Oct 06 15:06:39 crc kubenswrapper[4763]: I1006 15:06:39.581171 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" path="/var/lib/kubelet/pods/2673a5a7-684d-422b-bca1-b1b8f79b31d1/volumes" Oct 06 15:06:43 crc kubenswrapper[4763]: I1006 15:06:43.073479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:43 crc kubenswrapper[4763]: I1006 15:06:43.074109 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:43 crc kubenswrapper[4763]: I1006 15:06:43.116133 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:43 crc kubenswrapper[4763]: I1006 15:06:43.705172 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063025 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2"] Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063259 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="extract-utilities" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063273 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="extract-utilities" Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063284 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="extract-content" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063293 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="extract-content" Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063305 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="extract" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063313 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="extract" Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063337 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="util" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="util" Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063363 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="pull" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063371 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="pull" Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.063385 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="registry-server" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="registry-server" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063509 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a63df7-b6aa-47b5-a09c-0c8b4af726df" containerName="extract" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063527 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2673a5a7-684d-422b-bca1-b1b8f79b31d1" containerName="registry-server" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.063983 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.066574 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.066777 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.067339 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.067499 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8p4p2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.067854 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.077399 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2"] Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.094318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-webhook-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.094358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4hjv\" (UniqueName: \"kubernetes.io/projected/728f41e8-a80b-4512-b663-fa525ae11afd-kube-api-access-d4hjv\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.094387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-apiservice-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.195284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-webhook-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.195328 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4hjv\" (UniqueName: \"kubernetes.io/projected/728f41e8-a80b-4512-b663-fa525ae11afd-kube-api-access-d4hjv\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.195366 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-apiservice-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.200701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-apiservice-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.204127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/728f41e8-a80b-4512-b663-fa525ae11afd-webhook-cert\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.212676 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4hjv\" (UniqueName: \"kubernetes.io/projected/728f41e8-a80b-4512-b663-fa525ae11afd-kube-api-access-d4hjv\") pod \"metallb-operator-controller-manager-5588db756d-q7lc2\" (UID: \"728f41e8-a80b-4512-b663-fa525ae11afd\") " pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.294654 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl"] Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.295279 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.296080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-webhook-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.296125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-apiservice-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.296192 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lmz\" (UniqueName: \"kubernetes.io/projected/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-kube-api-access-89lmz\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: W1006 15:06:44.296790 4763 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 06 15:06:44 crc kubenswrapper[4763]: E1006 15:06:44.296822 4763 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.297349 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-drc9h" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.297923 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.319752 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl"] Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.379900 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.397669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-webhook-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.397721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-apiservice-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.397763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lmz\" (UniqueName: \"kubernetes.io/projected/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-kube-api-access-89lmz\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.401137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-apiservice-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.405651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-webhook-cert\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.422264 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lmz\" (UniqueName: \"kubernetes.io/projected/e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8-kube-api-access-89lmz\") pod \"metallb-operator-webhook-server-f4bd4784-h52rl\" (UID: \"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8\") " pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.609996 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:44 crc kubenswrapper[4763]: I1006 15:06:44.830073 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2"] Oct 06 15:06:44 crc kubenswrapper[4763]: W1006 15:06:44.838665 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728f41e8_a80b_4512_b663_fa525ae11afd.slice/crio-1c03824249f6e766002c39bf98ba10393629b63a7fb848253994254a8472ff90 WatchSource:0}: Error finding container 1c03824249f6e766002c39bf98ba10393629b63a7fb848253994254a8472ff90: Status 404 returned error can't find the container with id 1c03824249f6e766002c39bf98ba10393629b63a7fb848253994254a8472ff90 Oct 06 15:06:45 crc kubenswrapper[4763]: I1006 15:06:45.023076 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl"] Oct 06 15:06:45 crc kubenswrapper[4763]: W1006 15:06:45.029476 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5bf63ac_288e_44d9_a8cb_5f8ade87c0e8.slice/crio-8f77173020ac7e82be92d93091d880c9b4cca7df34a4523371c654bbc53d490a WatchSource:0}: Error finding container 8f77173020ac7e82be92d93091d880c9b4cca7df34a4523371c654bbc53d490a: Status 404 returned error can't find the container with id 8f77173020ac7e82be92d93091d880c9b4cca7df34a4523371c654bbc53d490a Oct 06 15:06:45 crc kubenswrapper[4763]: I1006 15:06:45.293083 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 15:06:45 crc kubenswrapper[4763]: I1006 15:06:45.673833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" event={"ID":"728f41e8-a80b-4512-b663-fa525ae11afd","Type":"ContainerStarted","Data":"1c03824249f6e766002c39bf98ba10393629b63a7fb848253994254a8472ff90"} Oct 06 15:06:45 crc kubenswrapper[4763]: I1006 15:06:45.675463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" event={"ID":"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8","Type":"ContainerStarted","Data":"8f77173020ac7e82be92d93091d880c9b4cca7df34a4523371c654bbc53d490a"} Oct 06 15:06:46 crc kubenswrapper[4763]: I1006 15:06:46.738496 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:46 crc kubenswrapper[4763]: I1006 15:06:46.738777 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qbszm" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="registry-server" containerID="cri-o://54e9cfa640317161f1fcea4a553d6541bea14951185ec0f665558e764bb3fdbe" gracePeriod=2 Oct 06 15:06:47 crc kubenswrapper[4763]: I1006 15:06:47.699076 4763 generic.go:334] "Generic (PLEG): container finished" podID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerID="54e9cfa640317161f1fcea4a553d6541bea14951185ec0f665558e764bb3fdbe" exitCode=0 Oct 06 15:06:47 crc kubenswrapper[4763]: I1006 15:06:47.699120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerDied","Data":"54e9cfa640317161f1fcea4a553d6541bea14951185ec0f665558e764bb3fdbe"} Oct 06 15:06:47 crc 
kubenswrapper[4763]: I1006 15:06:47.963514 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.052181 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content\") pod \"156aee71-ca2e-4c41-9ec6-6be508f8d024\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.052236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities\") pod \"156aee71-ca2e-4c41-9ec6-6be508f8d024\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.052316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55rkb\" (UniqueName: \"kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb\") pod \"156aee71-ca2e-4c41-9ec6-6be508f8d024\" (UID: \"156aee71-ca2e-4c41-9ec6-6be508f8d024\") " Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.053456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities" (OuterVolumeSpecName: "utilities") pod "156aee71-ca2e-4c41-9ec6-6be508f8d024" (UID: "156aee71-ca2e-4c41-9ec6-6be508f8d024"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.056952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb" (OuterVolumeSpecName: "kube-api-access-55rkb") pod "156aee71-ca2e-4c41-9ec6-6be508f8d024" (UID: "156aee71-ca2e-4c41-9ec6-6be508f8d024"). InnerVolumeSpecName "kube-api-access-55rkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.099774 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "156aee71-ca2e-4c41-9ec6-6be508f8d024" (UID: "156aee71-ca2e-4c41-9ec6-6be508f8d024"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.153814 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55rkb\" (UniqueName: \"kubernetes.io/projected/156aee71-ca2e-4c41-9ec6-6be508f8d024-kube-api-access-55rkb\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.153849 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.153863 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156aee71-ca2e-4c41-9ec6-6be508f8d024-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.768561 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbszm" event={"ID":"156aee71-ca2e-4c41-9ec6-6be508f8d024","Type":"ContainerDied","Data":"faf1e318e68a9cdf35234c72c179b71ca7e99d459d1ba373244ad40be7a6f412"} Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.768662 4763 scope.go:117] "RemoveContainer" containerID="54e9cfa640317161f1fcea4a553d6541bea14951185ec0f665558e764bb3fdbe" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.768805 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbszm" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.773359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" event={"ID":"728f41e8-a80b-4512-b663-fa525ae11afd","Type":"ContainerStarted","Data":"bf3118299ed03099f247d980f97d1bb6776c59e38ba66ee57d70dad6a42eef79"} Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.773928 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.802320 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" podStartSLOduration=1.657207002 podStartE2EDuration="4.80229783s" podCreationTimestamp="2025-10-06 15:06:44 +0000 UTC" firstStartedPulling="2025-10-06 15:06:44.842121816 +0000 UTC m=+801.997414328" lastFinishedPulling="2025-10-06 15:06:47.987212634 +0000 UTC m=+805.142505156" observedRunningTime="2025-10-06 15:06:48.795300805 +0000 UTC m=+805.950593327" watchObservedRunningTime="2025-10-06 15:06:48.80229783 +0000 UTC m=+805.957590342" Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.819131 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:48 crc kubenswrapper[4763]: I1006 15:06:48.824545 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qbszm"] Oct 06 15:06:49 crc kubenswrapper[4763]: I1006 15:06:49.583434 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" path="/var/lib/kubelet/pods/156aee71-ca2e-4c41-9ec6-6be508f8d024/volumes" Oct 06 15:06:50 crc kubenswrapper[4763]: I1006 15:06:50.056560 4763 scope.go:117] "RemoveContainer" containerID="936251f2ef65c013e14a0bd8a213be36524704c4035cad4bf6ab1bcbc4a028bd" Oct 06 15:06:50 
crc kubenswrapper[4763]: I1006 15:06:50.110328 4763 scope.go:117] "RemoveContainer" containerID="ab92609bc9729b3fd7ef8b1a9facdcd0c6a8999ebfdbe15a28759c72f422ff35" Oct 06 15:06:50 crc kubenswrapper[4763]: I1006 15:06:50.789635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" event={"ID":"e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8","Type":"ContainerStarted","Data":"4efcea407c9ded186d7d66008b658bcb20bba65dcfb875603bb5d858be907f18"} Oct 06 15:06:50 crc kubenswrapper[4763]: I1006 15:06:50.789796 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:06:50 crc kubenswrapper[4763]: I1006 15:06:50.815451 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" podStartSLOduration=1.720123394 podStartE2EDuration="6.815431713s" podCreationTimestamp="2025-10-06 15:06:44 +0000 UTC" firstStartedPulling="2025-10-06 15:06:45.032402452 +0000 UTC m=+802.187694964" lastFinishedPulling="2025-10-06 15:06:50.127710761 +0000 UTC m=+807.283003283" observedRunningTime="2025-10-06 15:06:50.810208977 +0000 UTC m=+807.965501499" watchObservedRunningTime="2025-10-06 15:06:50.815431713 +0000 UTC m=+807.970724235" Oct 06 15:07:03 crc kubenswrapper[4763]: I1006 15:07:03.876788 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:07:03 crc kubenswrapper[4763]: I1006 15:07:03.877691 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:07:04 crc kubenswrapper[4763]: I1006 15:07:04.619848 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-f4bd4784-h52rl" Oct 06 15:07:24 crc kubenswrapper[4763]: I1006 15:07:24.382233 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5588db756d-q7lc2" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.221346 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zgs9f"] Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.221537 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="registry-server" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.221548 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="registry-server" Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.221557 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="extract-utilities" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.221563 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="extract-utilities" Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.221577 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="extract-content" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.221583 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="extract-content" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.221703 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="156aee71-ca2e-4c41-9ec6-6be508f8d024" containerName="registry-server" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.223403 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.226398 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4mzqm" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.226660 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.227209 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.229675 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb"] Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.230457 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.246745 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb"] Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.255970 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.326383 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rlg6x"] Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.327379 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.329687 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.329845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.330141 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.330331 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w4l57" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.346373 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-tfsjz"] Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.347113 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.348282 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352003 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/897ea117-5211-437f-8ef9-206a43a00850-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f4c\" (UniqueName: \"kubernetes.io/projected/897ea117-5211-437f-8ef9-206a43a00850-kube-api-access-w5f4c\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm8r\" (UniqueName: \"kubernetes.io/projected/f153a5a1-097a-4743-91df-8aad1eb19149-kube-api-access-7mm8r\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ndjs\" (UniqueName: \"kubernetes.io/projected/52461501-c1a5-4e78-8ed6-2d33efa547bf-kube-api-access-5ndjs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352176 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-cert\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-reloader\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-sockets\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352328 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-metrics-certs\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352399 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352424 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslh6\" (UniqueName: \"kubernetes.io/projected/88887d3f-347d-41bd-9e25-1dcfa97b2175-kube-api-access-tslh6\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-startup\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f153a5a1-097a-4743-91df-8aad1eb19149-metallb-excludel2\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-conf\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.352678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics-certs\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.361375 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-tfsjz"] Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/897ea117-5211-437f-8ef9-206a43a00850-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc 
kubenswrapper[4763]: I1006 15:07:25.453279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f4c\" (UniqueName: \"kubernetes.io/projected/897ea117-5211-437f-8ef9-206a43a00850-kube-api-access-w5f4c\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm8r\" (UniqueName: \"kubernetes.io/projected/f153a5a1-097a-4743-91df-8aad1eb19149-kube-api-access-7mm8r\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ndjs\" (UniqueName: \"kubernetes.io/projected/52461501-c1a5-4e78-8ed6-2d33efa547bf-kube-api-access-5ndjs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-cert\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-reloader\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-sockets\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-metrics-certs\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tslh6\" (UniqueName: 
\"kubernetes.io/projected/88887d3f-347d-41bd-9e25-1dcfa97b2175-kube-api-access-tslh6\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453463 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-startup\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453486 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f153a5a1-097a-4743-91df-8aad1eb19149-metallb-excludel2\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-conf\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.453541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics-certs\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.453990 4763 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.454017 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.454062 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs podName:52461501-c1a5-4e78-8ed6-2d33efa547bf nodeName:}" failed. No retries permitted until 2025-10-06 15:07:25.954041566 +0000 UTC m=+843.109334078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs") pod "controller-68d546b9d8-tfsjz" (UID: "52461501-c1a5-4e78-8ed6-2d33efa547bf") : secret "controller-certs-secret" not found Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.454084 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist podName:f153a5a1-097a-4743-91df-8aad1eb19149 nodeName:}" failed. No retries permitted until 2025-10-06 15:07:25.954075097 +0000 UTC m=+843.109367739 (durationBeforeRetry 500ms). 
Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.454476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-reloader\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.454562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f153a5a1-097a-4743-91df-8aad1eb19149-metallb-excludel2\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.455012 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-conf\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.455017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.457147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-startup\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.457229 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88887d3f-347d-41bd-9e25-1dcfa97b2175-frr-sockets\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.458513 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.459313 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-metrics-certs\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.462195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/897ea117-5211-437f-8ef9-206a43a00850-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.462750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88887d3f-347d-41bd-9e25-1dcfa97b2175-metrics-certs\") pod \"frr-k8s-zgs9f\" (UID:
\"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.479034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-cert\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.481379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslh6\" (UniqueName: \"kubernetes.io/projected/88887d3f-347d-41bd-9e25-1dcfa97b2175-kube-api-access-tslh6\") pod \"frr-k8s-zgs9f\" (UID: \"88887d3f-347d-41bd-9e25-1dcfa97b2175\") " pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.481733 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ndjs\" (UniqueName: \"kubernetes.io/projected/52461501-c1a5-4e78-8ed6-2d33efa547bf-kube-api-access-5ndjs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.481859 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm8r\" (UniqueName: \"kubernetes.io/projected/f153a5a1-097a-4743-91df-8aad1eb19149-kube-api-access-7mm8r\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.484739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f4c\" (UniqueName: \"kubernetes.io/projected/897ea117-5211-437f-8ef9-206a43a00850-kube-api-access-w5f4c\") pod \"frr-k8s-webhook-server-64bf5d555-zc5sb\" (UID: \"897ea117-5211-437f-8ef9-206a43a00850\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.550299 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.559784 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.960740 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.961015 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.961150 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 15:07:25 crc kubenswrapper[4763]: E1006 15:07:25.961193 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist podName:f153a5a1-097a-4743-91df-8aad1eb19149 nodeName:}" failed. No retries permitted until 2025-10-06 15:07:26.961180251 +0000 UTC m=+844.116472763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist") pod "speaker-rlg6x" (UID: "f153a5a1-097a-4743-91df-8aad1eb19149") : secret "metallb-memberlist" not found
Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.964972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52461501-c1a5-4e78-8ed6-2d33efa547bf-metrics-certs\") pod \"controller-68d546b9d8-tfsjz\" (UID: \"52461501-c1a5-4e78-8ed6-2d33efa547bf\") " pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:25 crc kubenswrapper[4763]: I1006 15:07:25.988967 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb"] Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.037081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"d5ec5ddbc5898623ad6d12de63fc4fdae34d04969f2d9c43c7fcdf15df7e7dad"} Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.038439 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" event={"ID":"897ea117-5211-437f-8ef9-206a43a00850","Type":"ContainerStarted","Data":"71320310bdc6f5144791c2e7e01b76c9df1f8820cf39bff00200e606df0cb60c"} Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.263525 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.536485 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-tfsjz"] Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.974412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:26 crc kubenswrapper[4763]: I1006 15:07:26.985560 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f153a5a1-097a-4743-91df-8aad1eb19149-memberlist\") pod \"speaker-rlg6x\" (UID: \"f153a5a1-097a-4743-91df-8aad1eb19149\") " pod="metallb-system/speaker-rlg6x" Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.055291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tfsjz" event={"ID":"52461501-c1a5-4e78-8ed6-2d33efa547bf","Type":"ContainerStarted","Data":"105a9fcd7fe569c8b908c65fe7f27f539740599d3c5ab3ef656180f2162999fe"} Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.055373 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tfsjz" event={"ID":"52461501-c1a5-4e78-8ed6-2d33efa547bf","Type":"ContainerStarted","Data":"75a004bec5f79c9aac0c458e2fe5c4a76d32ce79558fa8352b2515a5ad44ce83"} Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.055519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-tfsjz"
event={"ID":"52461501-c1a5-4e78-8ed6-2d33efa547bf","Type":"ContainerStarted","Data":"fff764c178e93875c98fca28a53aecd1a4a9f98150f96ca3e6432b54482f7669"} Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.055588 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.077019 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-tfsjz" podStartSLOduration=2.077002228 podStartE2EDuration="2.077002228s" podCreationTimestamp="2025-10-06 15:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:07:27.075907417 +0000 UTC m=+844.231199939" watchObservedRunningTime="2025-10-06 15:07:27.077002228 +0000 UTC m=+844.232294750" Oct 06 15:07:27 crc kubenswrapper[4763]: I1006 15:07:27.142133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rlg6x" Oct 06 15:07:27 crc kubenswrapper[4763]: W1006 15:07:27.161602 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf153a5a1_097a_4743_91df_8aad1eb19149.slice/crio-0c65976cdd5ff25fe2501d8426236b490e5a850078fa1187b8e12b46e2112b1b WatchSource:0}: Error finding container 0c65976cdd5ff25fe2501d8426236b490e5a850078fa1187b8e12b46e2112b1b: Status 404 returned error can't find the container with id 0c65976cdd5ff25fe2501d8426236b490e5a850078fa1187b8e12b46e2112b1b Oct 06 15:07:28 crc kubenswrapper[4763]: I1006 15:07:28.062275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rlg6x" event={"ID":"f153a5a1-097a-4743-91df-8aad1eb19149","Type":"ContainerStarted","Data":"38f5f43835e64e96e83f9fccff1a506839156dd1ea7b94c2658907425d694183"} Oct 06 15:07:28 crc kubenswrapper[4763]: I1006 15:07:28.062577 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rlg6x" event={"ID":"f153a5a1-097a-4743-91df-8aad1eb19149","Type":"ContainerStarted","Data":"66be9b7b8cfec7bf32a0199d3c791e645e506d67740626ba5c65ece0bdcabb5b"} Oct 06 15:07:28 crc kubenswrapper[4763]: I1006 15:07:28.062588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rlg6x" event={"ID":"f153a5a1-097a-4743-91df-8aad1eb19149","Type":"ContainerStarted","Data":"0c65976cdd5ff25fe2501d8426236b490e5a850078fa1187b8e12b46e2112b1b"} Oct 06 15:07:28 crc kubenswrapper[4763]: I1006 15:07:28.062743 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rlg6x" Oct 06 15:07:28 crc kubenswrapper[4763]: I1006 15:07:28.079854 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rlg6x" podStartSLOduration=3.0798345 podStartE2EDuration="3.0798345s" podCreationTimestamp="2025-10-06 15:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:07:28.077908326 +0000 UTC m=+845.233200858" watchObservedRunningTime="2025-10-06 15:07:28.0798345 +0000 UTC m=+845.235127012" Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.836230 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.837424 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.848064 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.936402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.936453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglhc\" (UniqueName: \"kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:31 crc kubenswrapper[4763]: I1006 15:07:31.936671 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.047939 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.048020 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.048073 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wglhc\" (UniqueName: \"kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.048545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.048575 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.081461 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wglhc\" (UniqueName: \"kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc\") pod \"certified-operators-5jpc8\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:32 crc kubenswrapper[4763]: I1006 15:07:32.155293 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.050198 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.096380 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerStarted","Data":"8a9b322610429d461974a724e889dcff4085c2f9eafce13cab8c331f6004d8c7"} Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.876767 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.876856 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.876920 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.877703 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:07:33 crc kubenswrapper[4763]: I1006 15:07:33.877809 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c" gracePeriod=600 Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.107386 4763 generic.go:334] "Generic (PLEG): container finished" podID="88887d3f-347d-41bd-9e25-1dcfa97b2175" containerID="b759a26e72fe721888d2d0ded91bf5f6d5417cc47ef585cd435fe91b5f09325b" exitCode=0 Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.107533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerDied","Data":"b759a26e72fe721888d2d0ded91bf5f6d5417cc47ef585cd435fe91b5f09325b"} Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.116158 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" 
containerID="0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c" exitCode=0 Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.116270 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c"} Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.116325 4763 scope.go:117] "RemoveContainer" containerID="069367f1b19c60d5af5d9d31fac016bc92d6f93455be97f8f4ea4d72267ad7aa" Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.143636 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerID="d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11" exitCode=0 Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.143734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerDied","Data":"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11"} Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.160891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" event={"ID":"897ea117-5211-437f-8ef9-206a43a00850","Type":"ContainerStarted","Data":"5e41cbeb783d492f38ba3fe0cc11289393808d73055fd85caa1e992870e988c6"} Oct 06 15:07:34 crc kubenswrapper[4763]: I1006 15:07:34.161553 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:35 crc kubenswrapper[4763]: I1006 15:07:35.168543 4763 generic.go:334] "Generic (PLEG): container finished" podID="88887d3f-347d-41bd-9e25-1dcfa97b2175" containerID="19dca5b6c263c784391e21cb9efdbf4b28368c74fda1c78c1118268f98fe8f30" exitCode=0 Oct 06 15:07:35 crc kubenswrapper[4763]: I1006 15:07:35.168648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerDied","Data":"19dca5b6c263c784391e21cb9efdbf4b28368c74fda1c78c1118268f98fe8f30"} Oct 06 15:07:35 crc kubenswrapper[4763]: I1006 15:07:35.172402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea"} Oct 06 15:07:35 crc kubenswrapper[4763]: I1006 15:07:35.173891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerStarted","Data":"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da"} Oct 06 15:07:35 crc kubenswrapper[4763]: I1006 15:07:35.201085 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" podStartSLOduration=3.261030524 podStartE2EDuration="10.201059747s" podCreationTimestamp="2025-10-06 15:07:25 +0000 UTC" firstStartedPulling="2025-10-06 15:07:25.998329815 +0000 UTC m=+843.153622347" lastFinishedPulling="2025-10-06 15:07:32.938359018 +0000 UTC m=+850.093651570" observedRunningTime="2025-10-06 15:07:34.19771875 +0000 UTC m=+851.353011262" watchObservedRunningTime="2025-10-06 15:07:35.201059747 +0000 UTC 
m=+852.356352269" Oct 06 15:07:36 crc kubenswrapper[4763]: I1006 15:07:36.185227 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerID="adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da" exitCode=0 Oct 06 15:07:36 crc kubenswrapper[4763]: I1006 15:07:36.185377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerDied","Data":"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da"} Oct 06 15:07:36 crc kubenswrapper[4763]: I1006 15:07:36.190595 4763 generic.go:334] "Generic (PLEG): container finished" podID="88887d3f-347d-41bd-9e25-1dcfa97b2175" containerID="a73500cd8d488c6f5cf891aad13c5180f2628c104d6fbcda0fb98bc6834cf999" exitCode=0 Oct 06 15:07:36 crc kubenswrapper[4763]: I1006 15:07:36.190948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerDied","Data":"a73500cd8d488c6f5cf891aad13c5180f2628c104d6fbcda0fb98bc6834cf999"} Oct 06 15:07:36 crc kubenswrapper[4763]: I1006 15:07:36.267965 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-tfsjz" Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.147527 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rlg6x" Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.204556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"e041b48ba5bcf0b3acab7f8bdf86a66e0a136b32000abeb092a1ee429c6ddb6b"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.204610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"a07d374e40869a672ec12792ff8474c4e207d4fd7c6473406f69e58312bfeeda"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.204670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"9fda6a051ea89996ea6df01e46c162217b9e2970f26fba8413b482fc73e55e88"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.204686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"642ffdb5fd31da0c231f79345b139c7b9d6822ff9edda1a000d18e392a0e43e5"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.204701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"2419f49ff6946f7dc32475cced845cdf354df7a6f43c8c9de8123c394b3bf9c6"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.206685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerStarted","Data":"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab"} Oct 06 15:07:37 crc kubenswrapper[4763]: I1006 15:07:37.228695 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jpc8" 
Oct 06 15:07:38 crc kubenswrapper[4763]: I1006 15:07:38.217235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zgs9f" event={"ID":"88887d3f-347d-41bd-9e25-1dcfa97b2175","Type":"ContainerStarted","Data":"d5aae57077fad31834022c41e3fa005924a9dc88e683b5ad4893ae1c6372cd59"} Oct 06 15:07:38 crc kubenswrapper[4763]: I1006 15:07:38.235609 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zgs9f" podStartSLOduration=6.028876841 podStartE2EDuration="13.235587177s" podCreationTimestamp="2025-10-06 15:07:25 +0000 UTC" firstStartedPulling="2025-10-06 15:07:25.69920768 +0000 UTC m=+842.854500202" lastFinishedPulling="2025-10-06 15:07:32.905918026 +0000 UTC m=+850.061210538" observedRunningTime="2025-10-06 15:07:38.234674902 +0000 UTC m=+855.389967424" watchObservedRunningTime="2025-10-06 15:07:38.235587177 +0000 UTC m=+855.390879699" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.090628 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp"] Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.092447 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.094194 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp"] Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.097076 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.146244 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjzgz\" (UniqueName: \"kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.146347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.146381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") "
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.222682 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.247771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.247818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.247905 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjzgz\" (UniqueName: \"kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.249117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.249420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.269286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjzgz\" (UniqueName: \"kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.409731 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:39 crc kubenswrapper[4763]: I1006 15:07:39.884429 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp"] Oct 06 15:07:40 crc kubenswrapper[4763]: I1006 15:07:40.230189 4763 generic.go:334] "Generic (PLEG): container finished" podID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerID="c87a475df2e516431db2f4b1732edc71e88b63d1b6b577cbce90097a5d9efdec" exitCode=0 Oct 06 15:07:40 crc kubenswrapper[4763]: I1006 15:07:40.230410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" event={"ID":"f596e576-84ee-4cf9-9a96-c342d16ece3b","Type":"ContainerDied","Data":"c87a475df2e516431db2f4b1732edc71e88b63d1b6b577cbce90097a5d9efdec"} Oct 06 15:07:40 crc kubenswrapper[4763]: I1006 15:07:40.231379 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" event={"ID":"f596e576-84ee-4cf9-9a96-c342d16ece3b","Type":"ContainerStarted","Data":"97d539c06125d24c6e05587068fbc8e2ac71b01c2fbab0fd5f15b15b7e20a898"} Oct 06 15:07:40 crc kubenswrapper[4763]: I1006 15:07:40.550851 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:40 crc kubenswrapper[4763]: I1006 15:07:40.603077 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:42 crc kubenswrapper[4763]: I1006 15:07:42.155463 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:42 crc kubenswrapper[4763]: I1006 15:07:42.155564 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:42 crc kubenswrapper[4763]: I1006 15:07:42.213179 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:42 crc kubenswrapper[4763]: I1006 15:07:42.287962 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:44 crc kubenswrapper[4763]: I1006 15:07:44.257909 4763 generic.go:334] "Generic (PLEG): container finished" podID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerID="e0ca8f4418c29fb20461f4dd126be9cc1693a59f6066842186864b6483b2b5a8" exitCode=0 Oct 06 15:07:44 crc kubenswrapper[4763]: I1006 15:07:44.257951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" event={"ID":"f596e576-84ee-4cf9-9a96-c342d16ece3b","Type":"ContainerDied","Data":"e0ca8f4418c29fb20461f4dd126be9cc1693a59f6066842186864b6483b2b5a8"} Oct 06 15:07:44 crc kubenswrapper[4763]: I1006 15:07:44.619141 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.267507 4763 generic.go:334] "Generic (PLEG): container finished" podID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerID="9c9212864e7e56f73801084e08f1abd9b1b75bf9073a86a75ea1a8e6f168f98f" exitCode=0 Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.267662 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" event={"ID":"f596e576-84ee-4cf9-9a96-c342d16ece3b","Type":"ContainerDied","Data":"9c9212864e7e56f73801084e08f1abd9b1b75bf9073a86a75ea1a8e6f168f98f"} Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.267835 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jpc8" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="registry-server" containerID="cri-o://dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab" gracePeriod=2 Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.569275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zc5sb" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.748163 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.840368 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content\") pod \"ed48753b-ca22-4e34-904d-dcc14936fd87\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.840472 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities\") pod \"ed48753b-ca22-4e34-904d-dcc14936fd87\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.840531 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wglhc\" (UniqueName: \"kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc\") pod \"ed48753b-ca22-4e34-904d-dcc14936fd87\" (UID: \"ed48753b-ca22-4e34-904d-dcc14936fd87\") " Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.842576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities" (OuterVolumeSpecName: "utilities") pod "ed48753b-ca22-4e34-904d-dcc14936fd87" (UID: "ed48753b-ca22-4e34-904d-dcc14936fd87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.849452 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc" (OuterVolumeSpecName: "kube-api-access-wglhc") pod "ed48753b-ca22-4e34-904d-dcc14936fd87" (UID: "ed48753b-ca22-4e34-904d-dcc14936fd87"). InnerVolumeSpecName "kube-api-access-wglhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.905737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed48753b-ca22-4e34-904d-dcc14936fd87" (UID: "ed48753b-ca22-4e34-904d-dcc14936fd87"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.942497 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.942554 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wglhc\" (UniqueName: \"kubernetes.io/projected/ed48753b-ca22-4e34-904d-dcc14936fd87-kube-api-access-wglhc\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:45 crc kubenswrapper[4763]: I1006 15:07:45.942573 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed48753b-ca22-4e34-904d-dcc14936fd87-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.276775 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerID="dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab" exitCode=0 Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.276914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerDied","Data":"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab"} Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.276976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jpc8" event={"ID":"ed48753b-ca22-4e34-904d-dcc14936fd87","Type":"ContainerDied","Data":"8a9b322610429d461974a724e889dcff4085c2f9eafce13cab8c331f6004d8c7"} Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.277006 4763 scope.go:117] "RemoveContainer" containerID="dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.278786 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jpc8" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.306236 4763 scope.go:117] "RemoveContainer" containerID="adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.339487 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.346548 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jpc8"] Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.358379 4763 scope.go:117] "RemoveContainer" containerID="d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.375173 4763 scope.go:117] "RemoveContainer" containerID="dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab" Oct 06 15:07:46 crc kubenswrapper[4763]: E1006 15:07:46.375515 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab\": container with ID starting with dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab not found: ID does not exist" containerID="dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.375544 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab"} err="failed to get container status \"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab\": rpc error: code = NotFound desc = could not find container \"dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab\": container with ID starting with dc149ba0a275e53b15874116a9667619c31141a6eda9fa3d68437776c0dd75ab not found: ID does not exist" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.375564 4763 scope.go:117] "RemoveContainer" containerID="adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da" Oct 06 15:07:46 crc kubenswrapper[4763]: E1006 15:07:46.375969 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da\": container with ID starting with adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da not found: ID does not exist" containerID="adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.375998 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da"} err="failed to get container status \"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da\": rpc error: code = NotFound desc = could not find container \"adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da\": container with ID starting with adacbbef2e11371375265de74266e7f1feb752ed5ff9ba5f6226e17a697da6da not found: ID does not exist" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.376014 4763 scope.go:117] "RemoveContainer" containerID="d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11" Oct 06 15:07:46 crc kubenswrapper[4763]: E1006 15:07:46.376267 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11\": container with ID starting with d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11 not found: ID does not exist" containerID="d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11"
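The three "ContainerStatus from runtime service failed ... NotFound" errors above are benign: the pod was deleted while its containers were already being removed, so by the time the kubelet re-queries CRI-O for status the IDs are gone, and "already gone" is the desired end state of a delete. Cleanup code typically treats a gRPC NotFound from the runtime as success. A minimal sketch of that check, using the gRPC status codes visible in the error text (the container ID string is a truncated stand-in, not a real ID):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI error just means the container was
// removed before we asked about it -- the benign case in the log above.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Stand-in for the runtime's response; the real one arrives over gRPC.
	err := status.Error(codes.NotFound, `could not find container "dc149ba0..."`)
	if alreadyGone(err) {
		fmt.Println("container already removed; treat deletion as complete")
	}
}
```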
failed" err="rpc error: code = NotFound desc = could not find container \"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11\": container with ID starting with d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11 not found: ID does not exist" containerID="d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.376293 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11"} err="failed to get container status \"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11\": rpc error: code = NotFound desc = could not find container \"d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11\": container with ID starting with d99e3d4e5c32428b1e5a4d4ad56a341c51f79218e4f710842482d519203deb11 not found: ID does not exist" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.580393 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.652991 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util\") pod \"f596e576-84ee-4cf9-9a96-c342d16ece3b\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.653144 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjzgz\" (UniqueName: \"kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz\") pod \"f596e576-84ee-4cf9-9a96-c342d16ece3b\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.653195 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle\") pod \"f596e576-84ee-4cf9-9a96-c342d16ece3b\" (UID: \"f596e576-84ee-4cf9-9a96-c342d16ece3b\") " Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.656329 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle" (OuterVolumeSpecName: "bundle") pod "f596e576-84ee-4cf9-9a96-c342d16ece3b" (UID: "f596e576-84ee-4cf9-9a96-c342d16ece3b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.679079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz" (OuterVolumeSpecName: "kube-api-access-wjzgz") pod "f596e576-84ee-4cf9-9a96-c342d16ece3b" (UID: "f596e576-84ee-4cf9-9a96-c342d16ece3b"). InnerVolumeSpecName "kube-api-access-wjzgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.696810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util" (OuterVolumeSpecName: "util") pod "f596e576-84ee-4cf9-9a96-c342d16ece3b" (UID: "f596e576-84ee-4cf9-9a96-c342d16ece3b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.755186 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.755224 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f596e576-84ee-4cf9-9a96-c342d16ece3b-util\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:46 crc kubenswrapper[4763]: I1006 15:07:46.755234 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjzgz\" (UniqueName: \"kubernetes.io/projected/f596e576-84ee-4cf9-9a96-c342d16ece3b-kube-api-access-wjzgz\") on node \"crc\" DevicePath \"\"" Oct 06 15:07:47 crc kubenswrapper[4763]: I1006 15:07:47.289512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" event={"ID":"f596e576-84ee-4cf9-9a96-c342d16ece3b","Type":"ContainerDied","Data":"97d539c06125d24c6e05587068fbc8e2ac71b01c2fbab0fd5f15b15b7e20a898"} Oct 06 15:07:47 crc kubenswrapper[4763]: I1006 15:07:47.289582 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97d539c06125d24c6e05587068fbc8e2ac71b01c2fbab0fd5f15b15b7e20a898" Oct 06 15:07:47 crc kubenswrapper[4763]: I1006 15:07:47.289589 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp" Oct 06 15:07:47 crc kubenswrapper[4763]: I1006 15:07:47.588399 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" path="/var/lib/kubelet/pods/ed48753b-ca22-4e34-904d-dcc14936fd87/volumes" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:50.995593 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62"] Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000464 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="util" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000484 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="util" Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000497 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="extract-content" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000503 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="extract-content" Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000511 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="extract-utilities" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000518 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="extract-utilities" Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000534 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="registry-server" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000540 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="registry-server" Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000548 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="extract" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000553 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="extract" Oct 06 15:07:51 crc kubenswrapper[4763]: E1006 15:07:51.000563 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="pull" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000569 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="pull" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000691 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed48753b-ca22-4e34-904d-dcc14936fd87" containerName="registry-server" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.000704 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f596e576-84ee-4cf9-9a96-c342d16ece3b" containerName="extract" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.001157 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.008156 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.008875 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-ctv8s" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.009032 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.012141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn5h\" (UniqueName: \"kubernetes.io/projected/eaf15887-6248-4ec3-a8c2-0d2a2073e115-kube-api-access-qrn5h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2hp62\" (UID: \"eaf15887-6248-4ec3-a8c2-0d2a2073e115\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.012572 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62"] Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.113302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn5h\" (UniqueName: \"kubernetes.io/projected/eaf15887-6248-4ec3-a8c2-0d2a2073e115-kube-api-access-qrn5h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2hp62\" (UID: \"eaf15887-6248-4ec3-a8c2-0d2a2073e115\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.144749 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn5h\" (UniqueName: \"kubernetes.io/projected/eaf15887-6248-4ec3-a8c2-0d2a2073e115-kube-api-access-qrn5h\") pod 
\"cert-manager-operator-controller-manager-57cd46d6d-2hp62\" (UID: \"eaf15887-6248-4ec3-a8c2-0d2a2073e115\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.320679 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" Oct 06 15:07:51 crc kubenswrapper[4763]: I1006 15:07:51.613195 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62"] Oct 06 15:07:51 crc kubenswrapper[4763]: W1006 15:07:51.630894 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf15887_6248_4ec3_a8c2_0d2a2073e115.slice/crio-14a7f6681b37908686e18601d5410c585c5805e2150bb96456b95113181fa28a WatchSource:0}: Error finding container 14a7f6681b37908686e18601d5410c585c5805e2150bb96456b95113181fa28a: Status 404 returned error can't find the container with id 14a7f6681b37908686e18601d5410c585c5805e2150bb96456b95113181fa28a Oct 06 15:07:52 crc kubenswrapper[4763]: I1006 15:07:52.319944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" event={"ID":"eaf15887-6248-4ec3-a8c2-0d2a2073e115","Type":"ContainerStarted","Data":"14a7f6681b37908686e18601d5410c585c5805e2150bb96456b95113181fa28a"} Oct 06 15:07:55 crc kubenswrapper[4763]: I1006 15:07:55.554760 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zgs9f" Oct 06 15:07:59 crc kubenswrapper[4763]: I1006 15:07:59.360507 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" event={"ID":"eaf15887-6248-4ec3-a8c2-0d2a2073e115","Type":"ContainerStarted","Data":"fb549fbc6221266148c80c8bee621e8102ae789573ebe225c061fedacfe169e6"} Oct 06 15:07:59 crc kubenswrapper[4763]: I1006 15:07:59.389590 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2hp62" podStartSLOduration=2.204618471 podStartE2EDuration="9.389569011s" podCreationTimestamp="2025-10-06 15:07:50 +0000 UTC" firstStartedPulling="2025-10-06 15:07:51.63688338 +0000 UTC m=+868.792175892" lastFinishedPulling="2025-10-06 15:07:58.82183392 +0000 UTC m=+875.977126432" observedRunningTime="2025-10-06 15:07:59.387414561 +0000 UTC m=+876.542707093" watchObservedRunningTime="2025-10-06 15:07:59.389569011 +0000 UTC m=+876.544861523" Oct 06 15:08:02 crc kubenswrapper[4763]: I1006 15:08:02.999559 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rwjdz"] Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.000938 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.005055 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.005198 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-97hw6" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.005319 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.008221 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rwjdz"] Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.182603 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.183052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7k9k\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-kube-api-access-q7k9k\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.284877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.284956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7k9k\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-kube-api-access-q7k9k\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.307729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.307958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7k9k\" (UniqueName: \"kubernetes.io/projected/20b26141-5d1b-4c8f-97fb-626f669768b1-kube-api-access-q7k9k\") pod \"cert-manager-webhook-d969966f-rwjdz\" (UID: \"20b26141-5d1b-4c8f-97fb-626f669768b1\") " pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.320342 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.553675 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rwjdz"] Oct 06 15:08:03 crc kubenswrapper[4763]: W1006 15:08:03.565234 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b26141_5d1b_4c8f_97fb_626f669768b1.slice/crio-ef7f92bc871ded8974d3f5f667093062b6e4e36b64ed80eb2061a2a6fef703f7 WatchSource:0}: Error finding container ef7f92bc871ded8974d3f5f667093062b6e4e36b64ed80eb2061a2a6fef703f7: Status 404 returned error can't find the container with id ef7f92bc871ded8974d3f5f667093062b6e4e36b64ed80eb2061a2a6fef703f7 Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.774939 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr"] Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.775572 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.778137 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cqbjb" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.792200 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr"] Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.895434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" (UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.895525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g9pp\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-kube-api-access-7g9pp\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" (UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.996304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" (UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:03 crc kubenswrapper[4763]: I1006 15:08:03.996370 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g9pp\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-kube-api-access-7g9pp\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" (UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:04 crc kubenswrapper[4763]: I1006 15:08:04.012424 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" 
(UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:04 crc kubenswrapper[4763]: I1006 15:08:04.012675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g9pp\" (UniqueName: \"kubernetes.io/projected/065553f7-5ee3-4c9f-ac08-a22904cfd751-kube-api-access-7g9pp\") pod \"cert-manager-cainjector-7d9f95dbf-vg7rr\" (UID: \"065553f7-5ee3-4c9f-ac08-a22904cfd751\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:04 crc kubenswrapper[4763]: I1006 15:08:04.119184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" Oct 06 15:08:04 crc kubenswrapper[4763]: I1006 15:08:04.398925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" event={"ID":"20b26141-5d1b-4c8f-97fb-626f669768b1","Type":"ContainerStarted","Data":"ef7f92bc871ded8974d3f5f667093062b6e4e36b64ed80eb2061a2a6fef703f7"} Oct 06 15:08:04 crc kubenswrapper[4763]: I1006 15:08:04.501477 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr"] Oct 06 15:08:04 crc kubenswrapper[4763]: W1006 15:08:04.509720 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod065553f7_5ee3_4c9f_ac08_a22904cfd751.slice/crio-39ddd2a8f06f01e379c8910acf34b209189bd125976028d90e3391582e524552 WatchSource:0}: Error finding container 39ddd2a8f06f01e379c8910acf34b209189bd125976028d90e3391582e524552: Status 404 returned error can't find the container with id 39ddd2a8f06f01e379c8910acf34b209189bd125976028d90e3391582e524552 Oct 06 15:08:05 crc kubenswrapper[4763]: I1006 15:08:05.404144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" event={"ID":"065553f7-5ee3-4c9f-ac08-a22904cfd751","Type":"ContainerStarted","Data":"39ddd2a8f06f01e379c8910acf34b209189bd125976028d90e3391582e524552"} Oct 06 15:08:08 crc kubenswrapper[4763]: I1006 15:08:08.423137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" event={"ID":"20b26141-5d1b-4c8f-97fb-626f669768b1","Type":"ContainerStarted","Data":"017bbbe8285f1cc51bae00d9acd73e2fa84797f1790448a2bf2057bee2b59810"} Oct 06 15:08:08 crc kubenswrapper[4763]: I1006 15:08:08.423590 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:08 crc kubenswrapper[4763]: I1006 15:08:08.427143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" event={"ID":"065553f7-5ee3-4c9f-ac08-a22904cfd751","Type":"ContainerStarted","Data":"42100aca514d8fd3deedbaa11d6636b5a146f1ebe6cf3944ba17633482acf512"} Oct 06 15:08:08 crc kubenswrapper[4763]: I1006 15:08:08.440281 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" podStartSLOduration=1.8982676029999999 podStartE2EDuration="6.440265052s" podCreationTimestamp="2025-10-06 15:08:02 +0000 UTC" firstStartedPulling="2025-10-06 15:08:03.568747041 +0000 UTC m=+880.724039573" lastFinishedPulling="2025-10-06 15:08:08.11074451 +0000 UTC m=+885.266037022" observedRunningTime="2025-10-06 15:08:08.437405892 +0000 UTC m=+885.592698494" watchObservedRunningTime="2025-10-06 15:08:08.440265052 
+0000 UTC m=+885.595557564" Oct 06 15:08:08 crc kubenswrapper[4763]: I1006 15:08:08.454524 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-vg7rr" podStartSLOduration=1.842515825 podStartE2EDuration="5.454508948s" podCreationTimestamp="2025-10-06 15:08:03 +0000 UTC" firstStartedPulling="2025-10-06 15:08:04.5135939 +0000 UTC m=+881.668886412" lastFinishedPulling="2025-10-06 15:08:08.125587023 +0000 UTC m=+885.280879535" observedRunningTime="2025-10-06 15:08:08.453792258 +0000 UTC m=+885.609084780" watchObservedRunningTime="2025-10-06 15:08:08.454508948 +0000 UTC m=+885.609801450" Oct 06 15:08:13 crc kubenswrapper[4763]: I1006 15:08:13.324469 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-rwjdz" Oct 06 15:08:20 crc kubenswrapper[4763]: I1006 15:08:20.949996 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-zm2kr"] Oct 06 15:08:20 crc kubenswrapper[4763]: I1006 15:08:20.952197 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:20 crc kubenswrapper[4763]: I1006 15:08:20.964908 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xb6d9" Oct 06 15:08:20 crc kubenswrapper[4763]: I1006 15:08:20.965876 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-zm2kr"] Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.050760 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.050875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4czw\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-kube-api-access-k4czw\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.151857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.151991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4czw\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-kube-api-access-k4czw\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.178819 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4czw\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-kube-api-access-k4czw\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " 
pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.180931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f65abbd1-72e7-4956-b93a-10f9957210fe-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-zm2kr\" (UID: \"f65abbd1-72e7-4956-b93a-10f9957210fe\") " pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.293065 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" Oct 06 15:08:21 crc kubenswrapper[4763]: I1006 15:08:21.771136 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-zm2kr"] Oct 06 15:08:21 crc kubenswrapper[4763]: W1006 15:08:21.775810 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf65abbd1_72e7_4956_b93a_10f9957210fe.slice/crio-1a532d784642496d92d82d9b72a7507cbfd60c626b2cbdc3a864f6585a7dfcc0 WatchSource:0}: Error finding container 1a532d784642496d92d82d9b72a7507cbfd60c626b2cbdc3a864f6585a7dfcc0: Status 404 returned error can't find the container with id 1a532d784642496d92d82d9b72a7507cbfd60c626b2cbdc3a864f6585a7dfcc0 Oct 06 15:08:22 crc kubenswrapper[4763]: I1006 15:08:22.547161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" event={"ID":"f65abbd1-72e7-4956-b93a-10f9957210fe","Type":"ContainerStarted","Data":"903a790d124fb36e2e4f88768f8b9326e671924354ff270cdda9544895d2cca0"} Oct 06 15:08:22 crc kubenswrapper[4763]: I1006 15:08:22.547555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" event={"ID":"f65abbd1-72e7-4956-b93a-10f9957210fe","Type":"ContainerStarted","Data":"1a532d784642496d92d82d9b72a7507cbfd60c626b2cbdc3a864f6585a7dfcc0"} Oct 06 15:08:22 crc kubenswrapper[4763]: I1006 15:08:22.581344 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-zm2kr" podStartSLOduration=2.581320913 podStartE2EDuration="2.581320913s" podCreationTimestamp="2025-10-06 15:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:08:22.576582231 +0000 UTC m=+899.731874763" watchObservedRunningTime="2025-10-06 15:08:22.581320913 +0000 UTC m=+899.736613435" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.274061 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6qtzn"] Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.275525 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.277770 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.277871 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.277898 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-575dv" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.286661 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6qtzn"] Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.447611 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxcnn\" (UniqueName: \"kubernetes.io/projected/cc23101a-9ccd-42c8-bcbc-3a510205301e-kube-api-access-zxcnn\") pod \"openstack-operator-index-6qtzn\" (UID: \"cc23101a-9ccd-42c8-bcbc-3a510205301e\") " pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.549752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxcnn\" (UniqueName: \"kubernetes.io/projected/cc23101a-9ccd-42c8-bcbc-3a510205301e-kube-api-access-zxcnn\") pod \"openstack-operator-index-6qtzn\" (UID: \"cc23101a-9ccd-42c8-bcbc-3a510205301e\") " pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.581587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxcnn\" (UniqueName: \"kubernetes.io/projected/cc23101a-9ccd-42c8-bcbc-3a510205301e-kube-api-access-zxcnn\") pod \"openstack-operator-index-6qtzn\" (UID: \"cc23101a-9ccd-42c8-bcbc-3a510205301e\") " pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.596070 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:27 crc kubenswrapper[4763]: I1006 15:08:27.813042 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6qtzn"] Oct 06 15:08:27 crc kubenswrapper[4763]: W1006 15:08:27.823558 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc23101a_9ccd_42c8_bcbc_3a510205301e.slice/crio-46e8de262f2b65fbc07a6f435838ac35f88b9a992dd9357dbc216670ef9efb1e WatchSource:0}: Error finding container 46e8de262f2b65fbc07a6f435838ac35f88b9a992dd9357dbc216670ef9efb1e: Status 404 returned error can't find the container with id 46e8de262f2b65fbc07a6f435838ac35f88b9a992dd9357dbc216670ef9efb1e Oct 06 15:08:28 crc kubenswrapper[4763]: I1006 15:08:28.596867 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6qtzn" event={"ID":"cc23101a-9ccd-42c8-bcbc-3a510205301e","Type":"ContainerStarted","Data":"46e8de262f2b65fbc07a6f435838ac35f88b9a992dd9357dbc216670ef9efb1e"} Oct 06 15:08:32 crc kubenswrapper[4763]: I1006 15:08:32.627145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6qtzn" event={"ID":"cc23101a-9ccd-42c8-bcbc-3a510205301e","Type":"ContainerStarted","Data":"440f53a99eeb50980a6861f19244ea102eef5db4e1d53ac7c86fdf6e0d4c6daa"} Oct 06 15:08:32 crc kubenswrapper[4763]: I1006 15:08:32.647262 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6qtzn" podStartSLOduration=1.063229788 podStartE2EDuration="5.647245897s" podCreationTimestamp="2025-10-06 15:08:27 +0000 UTC" firstStartedPulling="2025-10-06 15:08:27.825808973 +0000 UTC m=+904.981101485" lastFinishedPulling="2025-10-06 15:08:32.409825082 +0000 UTC m=+909.565117594" observedRunningTime="2025-10-06 15:08:32.645172197 +0000 UTC m=+909.800464739" watchObservedRunningTime="2025-10-06 15:08:32.647245897 +0000 UTC m=+909.802538409" Oct 06 15:08:37 crc kubenswrapper[4763]: I1006 15:08:37.596839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:37 crc kubenswrapper[4763]: I1006 15:08:37.597221 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:37 crc kubenswrapper[4763]: I1006 15:08:37.630007 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:37 crc kubenswrapper[4763]: I1006 15:08:37.691847 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6qtzn" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.607134 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs"] Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.609905 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.612706 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pzx4k" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.614186 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs"] Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.814119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.814308 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pq8g\" (UniqueName: \"kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.814464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.915668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.915857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pq8g\" (UniqueName: \"kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.915912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.916473 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.916870 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:44 crc kubenswrapper[4763]: I1006 15:08:44.947016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pq8g\" (UniqueName: \"kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g\") pod \"0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:45 crc kubenswrapper[4763]: I1006 15:08:45.235836 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:45 crc kubenswrapper[4763]: I1006 15:08:45.490654 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs"] Oct 06 15:08:45 crc kubenswrapper[4763]: W1006 15:08:45.501750 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e112159_8d72_4a19_9162_619d8f9bfa45.slice/crio-3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942 WatchSource:0}: Error finding container 3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942: Status 404 returned error can't find the container with id 3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942 Oct 06 15:08:45 crc kubenswrapper[4763]: I1006 15:08:45.738047 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerStarted","Data":"94311810726b885a839e0afdc3d92eb39a792c0bbefb6d9f872667921076e8db"} Oct 06 15:08:45 crc kubenswrapper[4763]: I1006 15:08:45.738343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerStarted","Data":"3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942"} Oct 06 15:08:46 crc kubenswrapper[4763]: I1006 15:08:46.750747 4763 generic.go:334] "Generic (PLEG): container finished" podID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerID="94311810726b885a839e0afdc3d92eb39a792c0bbefb6d9f872667921076e8db" exitCode=0 Oct 06 15:08:46 crc kubenswrapper[4763]: I1006 15:08:46.751154 4763 generic.go:334] "Generic (PLEG): container finished" podID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerID="459b64a4040bd9be0c1a48a9f2af06c56a5d81f38ff3d725184b7e65021e5927" exitCode=0 Oct 06 15:08:46 crc kubenswrapper[4763]: I1006 15:08:46.750864 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerDied","Data":"94311810726b885a839e0afdc3d92eb39a792c0bbefb6d9f872667921076e8db"} Oct 06 15:08:46 crc kubenswrapper[4763]: I1006 15:08:46.751214 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerDied","Data":"459b64a4040bd9be0c1a48a9f2af06c56a5d81f38ff3d725184b7e65021e5927"} Oct 06 15:08:47 crc kubenswrapper[4763]: I1006 15:08:47.764545 4763 generic.go:334] "Generic (PLEG): container finished" podID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerID="2cf22f8efd9872756dbc1b0aa266dbb9454735d6583b004ff09adc8822450eb9" exitCode=0 Oct 06 15:08:47 crc kubenswrapper[4763]: I1006 15:08:47.764654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerDied","Data":"2cf22f8efd9872756dbc1b0aa266dbb9454735d6583b004ff09adc8822450eb9"} Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.117364 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.274656 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle\") pod \"6e112159-8d72-4a19-9162-619d8f9bfa45\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.274758 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pq8g\" (UniqueName: \"kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g\") pod \"6e112159-8d72-4a19-9162-619d8f9bfa45\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.274900 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util\") pod \"6e112159-8d72-4a19-9162-619d8f9bfa45\" (UID: \"6e112159-8d72-4a19-9162-619d8f9bfa45\") " Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.276604 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle" (OuterVolumeSpecName: "bundle") pod "6e112159-8d72-4a19-9162-619d8f9bfa45" (UID: "6e112159-8d72-4a19-9162-619d8f9bfa45"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.283533 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g" (OuterVolumeSpecName: "kube-api-access-5pq8g") pod "6e112159-8d72-4a19-9162-619d8f9bfa45" (UID: "6e112159-8d72-4a19-9162-619d8f9bfa45"). InnerVolumeSpecName "kube-api-access-5pq8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.306038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util" (OuterVolumeSpecName: "util") pod "6e112159-8d72-4a19-9162-619d8f9bfa45" (UID: "6e112159-8d72-4a19-9162-619d8f9bfa45"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.376996 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.377053 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pq8g\" (UniqueName: \"kubernetes.io/projected/6e112159-8d72-4a19-9162-619d8f9bfa45-kube-api-access-5pq8g\") on node \"crc\" DevicePath \"\"" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.377082 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e112159-8d72-4a19-9162-619d8f9bfa45-util\") on node \"crc\" DevicePath \"\"" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.787308 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" event={"ID":"6e112159-8d72-4a19-9162-619d8f9bfa45","Type":"ContainerDied","Data":"3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942"} Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.787363 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac77f65929daf711e8c5946fc8b202133b584f901506c3bf090626f259fa942" Oct 06 15:08:49 crc kubenswrapper[4763]: I1006 15:08:49.787392 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.467282 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd"] Oct 06 15:08:57 crc kubenswrapper[4763]: E1006 15:08:57.468102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="pull" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.468117 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="pull" Oct 06 15:08:57 crc kubenswrapper[4763]: E1006 15:08:57.468136 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="extract" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.468144 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="extract" Oct 06 15:08:57 crc kubenswrapper[4763]: E1006 15:08:57.468155 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="util" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.468163 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="util" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.468297 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e112159-8d72-4a19-9162-619d8f9bfa45" containerName="extract" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.468998 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.471929 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-lhblq" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.479997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/ff1c32b8-62e7-4e36-ba7f-39fe0d187db3-kube-api-access-ms6vv\") pod \"openstack-operator-controller-operator-6bbd86684c-ss5vd\" (UID: \"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3\") " pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.485369 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd"] Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.581369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/ff1c32b8-62e7-4e36-ba7f-39fe0d187db3-kube-api-access-ms6vv\") pod \"openstack-operator-controller-operator-6bbd86684c-ss5vd\" (UID: \"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3\") " pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.604215 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6vv\" (UniqueName: \"kubernetes.io/projected/ff1c32b8-62e7-4e36-ba7f-39fe0d187db3-kube-api-access-ms6vv\") pod \"openstack-operator-controller-operator-6bbd86684c-ss5vd\" 
(UID: \"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3\") " pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:08:57 crc kubenswrapper[4763]: I1006 15:08:57.786120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:08:58 crc kubenswrapper[4763]: I1006 15:08:58.195707 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd"] Oct 06 15:08:58 crc kubenswrapper[4763]: I1006 15:08:58.849714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" event={"ID":"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3","Type":"ContainerStarted","Data":"73dea714290c42cee628bdf9c9d21c14375f66de72a250a79c88d5b0e7990361"} Oct 06 15:09:02 crc kubenswrapper[4763]: I1006 15:09:02.881229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" event={"ID":"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3","Type":"ContainerStarted","Data":"908a232fab0e95b3a04f07bc92da8aa0df65d54be1f9e528147a6f834e5bcf8c"} Oct 06 15:09:05 crc kubenswrapper[4763]: I1006 15:09:05.914156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" event={"ID":"ff1c32b8-62e7-4e36-ba7f-39fe0d187db3","Type":"ContainerStarted","Data":"758187a71346c6ef966f4f9618d9fbb9d3560ea9d3c3832cf01d60a4bb540f3e"} Oct 06 15:09:05 crc kubenswrapper[4763]: I1006 15:09:05.914677 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:09:05 crc kubenswrapper[4763]: I1006 15:09:05.962881 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" podStartSLOduration=2.434076861 podStartE2EDuration="8.962856202s" podCreationTimestamp="2025-10-06 15:08:57 +0000 UTC" firstStartedPulling="2025-10-06 15:08:58.203383046 +0000 UTC m=+935.358675558" lastFinishedPulling="2025-10-06 15:09:04.732162387 +0000 UTC m=+941.887454899" observedRunningTime="2025-10-06 15:09:05.956239855 +0000 UTC m=+943.111532397" watchObservedRunningTime="2025-10-06 15:09:05.962856202 +0000 UTC m=+943.118148724" Oct 06 15:09:07 crc kubenswrapper[4763]: I1006 15:09:07.788590 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6bbd86684c-ss5vd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.337914 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.339599 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.341414 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-x8z9k" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.349665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tww7x\" (UniqueName: \"kubernetes.io/projected/dd4eb024-6f04-4ee5-a485-78311ddee488-kube-api-access-tww7x\") pod \"barbican-operator-controller-manager-58c4cd55f4-ldl9r\" (UID: \"dd4eb024-6f04-4ee5-a485-78311ddee488\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.352739 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.354050 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.360190 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pjf47" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.365892 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.421071 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.427088 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.428327 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.433083 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zpznm" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.436732 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.437833 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.448856 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.450874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdwk\" (UniqueName: \"kubernetes.io/projected/f76d9cdd-3955-4727-87b4-20bf248be0f2-kube-api-access-zrdwk\") pod \"cinder-operator-controller-manager-7d4d4f8d-ctdbs\" (UID: \"f76d9cdd-3955-4727-87b4-20bf248be0f2\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.450952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qv7\" (UniqueName: \"kubernetes.io/projected/64a78de0-4ad4-40f2-bf1c-a830f4c32dd1-kube-api-access-f5qv7\") pod \"glance-operator-controller-manager-5dc44df7d5-cwcd8\" (UID: \"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.450996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tww7x\" (UniqueName: \"kubernetes.io/projected/dd4eb024-6f04-4ee5-a485-78311ddee488-kube-api-access-tww7x\") pod \"barbican-operator-controller-manager-58c4cd55f4-ldl9r\" (UID: \"dd4eb024-6f04-4ee5-a485-78311ddee488\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.451027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql967\" (UniqueName: \"kubernetes.io/projected/2c0fa69d-0c27-48bc-b120-e1416445de40-kube-api-access-ql967\") pod \"designate-operator-controller-manager-75dfd9b554-2k5df\" (UID: \"2c0fa69d-0c27-48bc-b120-e1416445de40\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.453836 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rdbt6" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.453943 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.460104 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-pc484"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.461811 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.463435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-j72bj" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.483675 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-pc484"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.496876 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.498192 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.510842 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-625x7"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.515329 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.515641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tww7x\" (UniqueName: \"kubernetes.io/projected/dd4eb024-6f04-4ee5-a485-78311ddee488-kube-api-access-tww7x\") pod \"barbican-operator-controller-manager-58c4cd55f4-ldl9r\" (UID: \"dd4eb024-6f04-4ee5-a485-78311ddee488\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.515858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zrmbd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.520727 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.528009 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-x25mk" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.528504 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.548820 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.550031 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.557198 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rm6sk" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf9j2\" (UniqueName: \"kubernetes.io/projected/1e501127-6403-4475-896e-52efee97894e-kube-api-access-sf9j2\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdwk\" (UniqueName: \"kubernetes.io/projected/f76d9cdd-3955-4727-87b4-20bf248be0f2-kube-api-access-zrdwk\") pod \"cinder-operator-controller-manager-7d4d4f8d-ctdbs\" (UID: \"f76d9cdd-3955-4727-87b4-20bf248be0f2\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jww\" (UniqueName: \"kubernetes.io/projected/97d973de-bf67-44c5-b76c-dcb36cab65b4-kube-api-access-b4jww\") pod \"ironic-operator-controller-manager-649675d675-x7zlp\" (UID: \"97d973de-bf67-44c5-b76c-dcb36cab65b4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558460 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548zl\" (UniqueName: \"kubernetes.io/projected/4a94b2e1-c815-45d3-b7ac-aa7b632af0ef-kube-api-access-548zl\") pod \"heat-operator-controller-manager-54b4974c45-pc484\" (UID: \"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558486 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qv7\" (UniqueName: \"kubernetes.io/projected/64a78de0-4ad4-40f2-bf1c-a830f4c32dd1-kube-api-access-f5qv7\") pod \"glance-operator-controller-manager-5dc44df7d5-cwcd8\" (UID: \"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558521 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql967\" (UniqueName: \"kubernetes.io/projected/2c0fa69d-0c27-48bc-b120-e1416445de40-kube-api-access-ql967\") pod \"designate-operator-controller-manager-75dfd9b554-2k5df\" (UID: \"2c0fa69d-0c27-48bc-b120-e1416445de40\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558538 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwq9\" (UniqueName: \"kubernetes.io/projected/e5c474ec-88f8-4c60-a866-fcc48cf5bcec-kube-api-access-pjwq9\") pod \"horizon-operator-controller-manager-76d5b87f47-s54qg\" (UID: \"e5c474ec-88f8-4c60-a866-fcc48cf5bcec\") " 
pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.558567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.559064 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.559994 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.561137 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2qmjx" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.574363 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-625x7"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.583588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qv7\" (UniqueName: \"kubernetes.io/projected/64a78de0-4ad4-40f2-bf1c-a830f4c32dd1-kube-api-access-f5qv7\") pod \"glance-operator-controller-manager-5dc44df7d5-cwcd8\" (UID: \"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.589313 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdwk\" (UniqueName: \"kubernetes.io/projected/f76d9cdd-3955-4727-87b4-20bf248be0f2-kube-api-access-zrdwk\") pod \"cinder-operator-controller-manager-7d4d4f8d-ctdbs\" (UID: \"f76d9cdd-3955-4727-87b4-20bf248be0f2\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.600241 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.608594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql967\" (UniqueName: \"kubernetes.io/projected/2c0fa69d-0c27-48bc-b120-e1416445de40-kube-api-access-ql967\") pod \"designate-operator-controller-manager-75dfd9b554-2k5df\" (UID: \"2c0fa69d-0c27-48bc-b120-e1416445de40\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.623723 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.624748 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.628255 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sgbxg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.659352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.659417 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.659484 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf9j2\" (UniqueName: \"kubernetes.io/projected/1e501127-6403-4475-896e-52efee97894e-kube-api-access-sf9j2\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: E1006 15:09:23.659500 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 06 15:09:23 crc kubenswrapper[4763]: E1006 15:09:23.659566 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert podName:1e501127-6403-4475-896e-52efee97894e nodeName:}" failed. No retries permitted until 2025-10-06 15:09:24.159545203 +0000 UTC m=+961.314837755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert") pod "infra-operator-controller-manager-658588b8c9-625x7" (UID: "1e501127-6403-4475-896e-52efee97894e") : secret "infra-operator-webhook-server-cert" not found Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.659716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jww\" (UniqueName: \"kubernetes.io/projected/97d973de-bf67-44c5-b76c-dcb36cab65b4-kube-api-access-b4jww\") pod \"ironic-operator-controller-manager-649675d675-x7zlp\" (UID: \"97d973de-bf67-44c5-b76c-dcb36cab65b4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.660006 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdlmc\" (UniqueName: \"kubernetes.io/projected/0c3ffc96-39cf-4c8d-9b14-90293eabb117-kube-api-access-fdlmc\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-7hqwv\" (UID: \"0c3ffc96-39cf-4c8d-9b14-90293eabb117\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.660046 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548zl\" (UniqueName: \"kubernetes.io/projected/4a94b2e1-c815-45d3-b7ac-aa7b632af0ef-kube-api-access-548zl\") pod \"heat-operator-controller-manager-54b4974c45-pc484\" (UID: \"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.660082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5474\" (UniqueName: \"kubernetes.io/projected/af8be53e-818e-4c1e-a90c-800b02a679e4-kube-api-access-z5474\") pod \"manila-operator-controller-manager-65d89cfd9f-lwhfc\" (UID: \"af8be53e-818e-4c1e-a90c-800b02a679e4\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.660107 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwq9\" (UniqueName: \"kubernetes.io/projected/e5c474ec-88f8-4c60-a866-fcc48cf5bcec-kube-api-access-pjwq9\") pod \"horizon-operator-controller-manager-76d5b87f47-s54qg\" (UID: \"e5c474ec-88f8-4c60-a866-fcc48cf5bcec\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.670260 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-x8z9k" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.675532 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.675634 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.685327 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.686352 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.695381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf9j2\" (UniqueName: \"kubernetes.io/projected/1e501127-6403-4475-896e-52efee97894e-kube-api-access-sf9j2\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.695788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pjf47" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.696066 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6fb8j" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.696543 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.706778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwq9\" (UniqueName: \"kubernetes.io/projected/e5c474ec-88f8-4c60-a866-fcc48cf5bcec-kube-api-access-pjwq9\") pod \"horizon-operator-controller-manager-76d5b87f47-s54qg\" (UID: \"e5c474ec-88f8-4c60-a866-fcc48cf5bcec\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.725700 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.725704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jww\" (UniqueName: \"kubernetes.io/projected/97d973de-bf67-44c5-b76c-dcb36cab65b4-kube-api-access-b4jww\") pod \"ironic-operator-controller-manager-649675d675-x7zlp\" (UID: \"97d973de-bf67-44c5-b76c-dcb36cab65b4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.726274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548zl\" (UniqueName: \"kubernetes.io/projected/4a94b2e1-c815-45d3-b7ac-aa7b632af0ef-kube-api-access-548zl\") pod \"heat-operator-controller-manager-54b4974c45-pc484\" (UID: \"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.739935 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.741225 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.751857 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ltpcw" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.757692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.761576 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmzq\" (UniqueName: \"kubernetes.io/projected/5f6171a0-9860-4ea2-b850-dc4a67f25499-kube-api-access-5lmzq\") pod \"neutron-operator-controller-manager-8d984cc4d-dp6zq\" (UID: \"5f6171a0-9860-4ea2-b850-dc4a67f25499\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.761654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5474\" (UniqueName: \"kubernetes.io/projected/af8be53e-818e-4c1e-a90c-800b02a679e4-kube-api-access-z5474\") pod \"manila-operator-controller-manager-65d89cfd9f-lwhfc\" (UID: \"af8be53e-818e-4c1e-a90c-800b02a679e4\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.761711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6xm\" (UniqueName: \"kubernetes.io/projected/96be297b-bf5c-4abe-853f-8562201ec721-kube-api-access-lt6xm\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd\" (UID: \"96be297b-bf5c-4abe-853f-8562201ec721\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.761762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdlmc\" (UniqueName: \"kubernetes.io/projected/0c3ffc96-39cf-4c8d-9b14-90293eabb117-kube-api-access-fdlmc\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-7hqwv\" (UID: \"0c3ffc96-39cf-4c8d-9b14-90293eabb117\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.762106 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.762723 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zpznm" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.763025 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.768243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w9nbh" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.771446 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.772993 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rdbt6" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.776890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdlmc\" (UniqueName: \"kubernetes.io/projected/0c3ffc96-39cf-4c8d-9b14-90293eabb117-kube-api-access-fdlmc\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-7hqwv\" (UID: \"0c3ffc96-39cf-4c8d-9b14-90293eabb117\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.780219 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.785435 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.782641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5474\" (UniqueName: \"kubernetes.io/projected/af8be53e-818e-4c1e-a90c-800b02a679e4-kube-api-access-z5474\") pod \"manila-operator-controller-manager-65d89cfd9f-lwhfc\" (UID: \"af8be53e-818e-4c1e-a90c-800b02a679e4\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.799960 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.824132 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.832944 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-vgtcn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.833770 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-j72bj" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.837528 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.844459 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.848076 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.852338 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.853584 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.855418 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.855652 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5m7wf" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmzq\" (UniqueName: \"kubernetes.io/projected/5f6171a0-9860-4ea2-b850-dc4a67f25499-kube-api-access-5lmzq\") pod \"neutron-operator-controller-manager-8d984cc4d-dp6zq\" (UID: \"5f6171a0-9860-4ea2-b850-dc4a67f25499\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvrj\" (UniqueName: \"kubernetes.io/projected/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-kube-api-access-2jvrj\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863511 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rccmp\" (UniqueName: \"kubernetes.io/projected/216ab588-0962-451e-8481-2dff1de05a59-kube-api-access-rccmp\") pod \"octavia-operator-controller-manager-7468f855d8-th4tv\" (UID: \"216ab588-0962-451e-8481-2dff1de05a59\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6xm\" (UniqueName: \"kubernetes.io/projected/96be297b-bf5c-4abe-853f-8562201ec721-kube-api-access-lt6xm\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd\" (UID: \"96be297b-bf5c-4abe-853f-8562201ec721\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863559 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w4hd\" (UniqueName: \"kubernetes.io/projected/a0c790cf-619e-4986-8dbb-f9ed71be08c7-kube-api-access-2w4hd\") pod \"nova-operator-controller-manager-7c7fc454ff-m4l5b\" (UID: \"a0c790cf-619e-4986-8dbb-f9ed71be08c7\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.863913 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.864989 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.867051 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hr5kz" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.880936 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.881255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6xm\" (UniqueName: \"kubernetes.io/projected/96be297b-bf5c-4abe-853f-8562201ec721-kube-api-access-lt6xm\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd\" (UID: \"96be297b-bf5c-4abe-853f-8562201ec721\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.883535 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.884352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmzq\" (UniqueName: \"kubernetes.io/projected/5f6171a0-9860-4ea2-b850-dc4a67f25499-kube-api-access-5lmzq\") pod \"neutron-operator-controller-manager-8d984cc4d-dp6zq\" (UID: \"5f6171a0-9860-4ea2-b850-dc4a67f25499\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.886267 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gss4r" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.891504 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.892477 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.896786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-tnhd4" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.918557 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.925963 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.940780 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.953146 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.963696 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.972252 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.973335 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.975561 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-g46br" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.978964 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf"] Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.984211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.985396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwm8\" (UniqueName: \"kubernetes.io/projected/4afc1cb2-2dde-439a-88ea-8aaeedbadc53-kube-api-access-zjwm8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-zlwcn\" (UID: \"4afc1cb2-2dde-439a-88ea-8aaeedbadc53\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.985427 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.985455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvrj\" (UniqueName: \"kubernetes.io/projected/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-kube-api-access-2jvrj\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.985495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rccmp\" (UniqueName: \"kubernetes.io/projected/216ab588-0962-451e-8481-2dff1de05a59-kube-api-access-rccmp\") pod \"octavia-operator-controller-manager-7468f855d8-th4tv\" (UID: \"216ab588-0962-451e-8481-2dff1de05a59\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.984100 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:23 crc kubenswrapper[4763]: I1006 15:09:23.985524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w4hd\" (UniqueName: \"kubernetes.io/projected/a0c790cf-619e-4986-8dbb-f9ed71be08c7-kube-api-access-2w4hd\") pod \"nova-operator-controller-manager-7c7fc454ff-m4l5b\" (UID: \"a0c790cf-619e-4986-8dbb-f9ed71be08c7\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:23 crc kubenswrapper[4763]: E1006 15:09:23.986539 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 15:09:23 crc kubenswrapper[4763]: E1006 15:09:23.987279 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert podName:be5ee6ad-dda0-4cb8-8c8a-e8051083873a nodeName:}" failed. No retries permitted until 2025-10-06 15:09:24.487263153 +0000 UTC m=+961.642555665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" (UID: "be5ee6ad-dda0-4cb8-8c8a-e8051083873a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.000559 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.007905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rccmp\" (UniqueName: \"kubernetes.io/projected/216ab588-0962-451e-8481-2dff1de05a59-kube-api-access-rccmp\") pod \"octavia-operator-controller-manager-7468f855d8-th4tv\" (UID: \"216ab588-0962-451e-8481-2dff1de05a59\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.009892 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvrj\" (UniqueName: \"kubernetes.io/projected/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-kube-api-access-2jvrj\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.015083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w4hd\" (UniqueName: \"kubernetes.io/projected/a0c790cf-619e-4986-8dbb-f9ed71be08c7-kube-api-access-2w4hd\") pod \"nova-operator-controller-manager-7c7fc454ff-m4l5b\" (UID: \"a0c790cf-619e-4986-8dbb-f9ed71be08c7\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.015183 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.015262 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.026318 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qmvp9" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.059157 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.068052 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.069165 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.076544 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.077243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vzgwn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.087508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56s8\" (UniqueName: \"kubernetes.io/projected/b754fcb6-88a9-42a9-9cfc-e624fe6d1afa-kube-api-access-g56s8\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nnrgf\" (UID: \"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.087696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vj7\" (UniqueName: \"kubernetes.io/projected/948b7d9f-215b-47d3-b8dc-a953d26e9041-kube-api-access-l5vj7\") pod \"watcher-operator-controller-manager-6cbc6dd547-jsb9x\" (UID: \"948b7d9f-215b-47d3-b8dc-a953d26e9041\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.087896 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwm8\" (UniqueName: \"kubernetes.io/projected/4afc1cb2-2dde-439a-88ea-8aaeedbadc53-kube-api-access-zjwm8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-zlwcn\" (UID: \"4afc1cb2-2dde-439a-88ea-8aaeedbadc53\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.087917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndqxj\" (UniqueName: \"kubernetes.io/projected/1e2641ba-13ac-4406-a4a7-79d2a4e7bafd-kube-api-access-ndqxj\") pod \"swift-operator-controller-manager-6859f9b676-brpsw\" (UID: \"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.087942 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ws4\" (UniqueName: \"kubernetes.io/projected/d7407f54-5a8a-4aa2-a65c-5cba9feedd94-kube-api-access-q7ws4\") pod 
\"placement-operator-controller-manager-54689d9f88-hbvnb\" (UID: \"d7407f54-5a8a-4aa2-a65c-5cba9feedd94\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.088798 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.099479 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.100636 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.103630 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.104419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ffpz7" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.107579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwm8\" (UniqueName: \"kubernetes.io/projected/4afc1cb2-2dde-439a-88ea-8aaeedbadc53-kube-api-access-zjwm8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-zlwcn\" (UID: \"4afc1cb2-2dde-439a-88ea-8aaeedbadc53\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.107854 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.132201 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.133051 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.137917 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-l6v54" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.155848 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191216 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfcx\" (UniqueName: \"kubernetes.io/projected/7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f-kube-api-access-xjfcx\") pod \"test-operator-controller-manager-5cd5cb47d7-w99ph\" (UID: \"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndqxj\" (UniqueName: \"kubernetes.io/projected/1e2641ba-13ac-4406-a4a7-79d2a4e7bafd-kube-api-access-ndqxj\") pod \"swift-operator-controller-manager-6859f9b676-brpsw\" (UID: \"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ws4\" (UniqueName: \"kubernetes.io/projected/d7407f54-5a8a-4aa2-a65c-5cba9feedd94-kube-api-access-q7ws4\") pod \"placement-operator-controller-manager-54689d9f88-hbvnb\" (UID: \"d7407f54-5a8a-4aa2-a65c-5cba9feedd94\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191358 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g56s8\" (UniqueName: \"kubernetes.io/projected/b754fcb6-88a9-42a9-9cfc-e624fe6d1afa-kube-api-access-g56s8\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nnrgf\" (UID: \"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.191379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vj7\" (UniqueName: \"kubernetes.io/projected/948b7d9f-215b-47d3-b8dc-a953d26e9041-kube-api-access-l5vj7\") pod \"watcher-operator-controller-manager-6cbc6dd547-jsb9x\" (UID: \"948b7d9f-215b-47d3-b8dc-a953d26e9041\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.217946 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.222757 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e501127-6403-4475-896e-52efee97894e-cert\") pod \"infra-operator-controller-manager-658588b8c9-625x7\" (UID: \"1e501127-6403-4475-896e-52efee97894e\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.228933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vj7\" (UniqueName: \"kubernetes.io/projected/948b7d9f-215b-47d3-b8dc-a953d26e9041-kube-api-access-l5vj7\") pod \"watcher-operator-controller-manager-6cbc6dd547-jsb9x\" (UID: \"948b7d9f-215b-47d3-b8dc-a953d26e9041\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.238206 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndqxj\" (UniqueName: \"kubernetes.io/projected/1e2641ba-13ac-4406-a4a7-79d2a4e7bafd-kube-api-access-ndqxj\") pod \"swift-operator-controller-manager-6859f9b676-brpsw\" (UID: \"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.255231 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.258442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ws4\" (UniqueName: \"kubernetes.io/projected/d7407f54-5a8a-4aa2-a65c-5cba9feedd94-kube-api-access-q7ws4\") pod \"placement-operator-controller-manager-54689d9f88-hbvnb\" (UID: \"d7407f54-5a8a-4aa2-a65c-5cba9feedd94\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.260181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56s8\" (UniqueName: \"kubernetes.io/projected/b754fcb6-88a9-42a9-9cfc-e624fe6d1afa-kube-api-access-g56s8\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nnrgf\" (UID: \"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.293330 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l467\" (UniqueName: \"kubernetes.io/projected/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-kube-api-access-8l467\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.293709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.293958 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncx8\" (UniqueName: \"kubernetes.io/projected/c48055a7-5392-47b3-a30c-f2f97c8463cc-kube-api-access-nncx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-69cq6\" (UID: \"c48055a7-5392-47b3-a30c-f2f97c8463cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.294065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfcx\" (UniqueName: \"kubernetes.io/projected/7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f-kube-api-access-xjfcx\") pod \"test-operator-controller-manager-5cd5cb47d7-w99ph\" (UID: \"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.306443 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.312769 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfcx\" (UniqueName: \"kubernetes.io/projected/7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f-kube-api-access-xjfcx\") pod \"test-operator-controller-manager-5cd5cb47d7-w99ph\" (UID: \"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.316556 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.338931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.344259 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.399357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncx8\" (UniqueName: \"kubernetes.io/projected/c48055a7-5392-47b3-a30c-f2f97c8463cc-kube-api-access-nncx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-69cq6\" (UID: \"c48055a7-5392-47b3-a30c-f2f97c8463cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.399443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l467\" (UniqueName: \"kubernetes.io/projected/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-kube-api-access-8l467\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.399504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: E1006 15:09:24.399694 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 06 15:09:24 crc kubenswrapper[4763]: E1006 15:09:24.399756 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert podName:4e80e79e-1bcd-40f5-a520-f8a850bd5cf5 nodeName:}" failed. No retries permitted until 2025-10-06 15:09:24.899736037 +0000 UTC m=+962.055028549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert") pod "openstack-operator-controller-manager-847bc59d9d-vvb5x" (UID: "4e80e79e-1bcd-40f5-a520-f8a850bd5cf5") : secret "webhook-server-cert" not found Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.419592 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.421951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncx8\" (UniqueName: \"kubernetes.io/projected/c48055a7-5392-47b3-a30c-f2f97c8463cc-kube-api-access-nncx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-69cq6\" (UID: \"c48055a7-5392-47b3-a30c-f2f97c8463cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.424092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l467\" (UniqueName: \"kubernetes.io/projected/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-kube-api-access-8l467\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.454906 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.458058 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.492572 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.505039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.510912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be5ee6ad-dda0-4cb8-8c8a-e8051083873a-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn\" (UID: \"be5ee6ad-dda0-4cb8-8c8a-e8051083873a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.525045 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.533373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.581869 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.587320 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.682903 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-pc484"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.697944 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.700133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.783305 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.814315 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd"] Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.910357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.916366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80e79e-1bcd-40f5-a520-f8a850bd5cf5-cert\") pod \"openstack-operator-controller-manager-847bc59d9d-vvb5x\" (UID: \"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5\") " pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:24 crc kubenswrapper[4763]: I1006 15:09:24.955225 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc"] Oct 06 15:09:24 crc kubenswrapper[4763]: W1006 15:09:24.962586 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf8be53e_818e_4c1e_a90c_800b02a679e4.slice/crio-f967a094fd9b9fc11179b64ef3e698ad7c33ab7bb7bb4acb5de37bca15df4684 WatchSource:0}: Error finding container f967a094fd9b9fc11179b64ef3e698ad7c33ab7bb7bb4acb5de37bca15df4684: Status 404 returned error can't find the container with id f967a094fd9b9fc11179b64ef3e698ad7c33ab7bb7bb4acb5de37bca15df4684 Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.055040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" event={"ID":"96be297b-bf5c-4abe-853f-8562201ec721","Type":"ContainerStarted","Data":"85935f38e509b5af4be13ca9891db2a8ad889bdd8200ca9a18e11bea25bf0744"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.057119 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" event={"ID":"97d973de-bf67-44c5-b76c-dcb36cab65b4","Type":"ContainerStarted","Data":"314621a6ceb00b45378a8781f6b164d572d9f6d9e4c69f13f3da5b90db82c923"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.058023 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" event={"ID":"f76d9cdd-3955-4727-87b4-20bf248be0f2","Type":"ContainerStarted","Data":"c2e82d5e8fce678189e447061f0be8296992920112068897097b3dd7e4574608"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.059351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" event={"ID":"e5c474ec-88f8-4c60-a866-fcc48cf5bcec","Type":"ContainerStarted","Data":"2ca7713c5f0b12b0415914f3b45a1961e016d94f1e09b9e11122a5c3e9f5f911"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.060404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" event={"ID":"af8be53e-818e-4c1e-a90c-800b02a679e4","Type":"ContainerStarted","Data":"f967a094fd9b9fc11179b64ef3e698ad7c33ab7bb7bb4acb5de37bca15df4684"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.061713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" event={"ID":"dd4eb024-6f04-4ee5-a485-78311ddee488","Type":"ContainerStarted","Data":"00b3190d6a2261ede842d434cb8d6dbfd8a308f4c1b168f43c5e316d1fa1c1ec"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.062799 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" event={"ID":"2c0fa69d-0c27-48bc-b120-e1416445de40","Type":"ContainerStarted","Data":"f1f72823bcb89131fccf912a3737972f8697f7fc2a0382093564aeaab0796e38"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.063820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" event={"ID":"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1","Type":"ContainerStarted","Data":"3222e91579ac10e8a43403ea3d5e6869db8c01209ed59a6ba4cf99c0ef628ddb"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.064980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" event={"ID":"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef","Type":"ContainerStarted","Data":"bd83f91f4c38233a57ec1cdc2716894ee06f6f21cedfab7657687e8c79220afd"} Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.121031 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.163191 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.169473 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.179183 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.183605 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.188837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb"] Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.190847 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c3ffc96_39cf_4c8d_9b14_90293eabb117.slice/crio-deff4a872edc78576c914eb4224de96b3b30c481c083300b71177a74928d3aae WatchSource:0}: Error finding container deff4a872edc78576c914eb4224de96b3b30c481c083300b71177a74928d3aae: Status 404 returned error can't find the container with id deff4a872edc78576c914eb4224de96b3b30c481c083300b71177a74928d3aae Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.192730 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn"] Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.194255 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6171a0_9860_4ea2_b850_dc4a67f25499.slice/crio-570b0f11f0426a4f03207d73c26a008837a4b34f03fadd52dbf1045243849678 WatchSource:0}: Error finding container 570b0f11f0426a4f03207d73c26a008837a4b34f03fadd52dbf1045243849678: Status 404 returned error can't find the container with id 570b0f11f0426a4f03207d73c26a008837a4b34f03fadd52dbf1045243849678 Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.196366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw"] Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.198778 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c790cf_619e_4986_8dbb_f9ed71be08c7.slice/crio-32f2871e06d969503e7eb59b0eb0b8deec8714140864b9d2b15cf9db29cc37c6 WatchSource:0}: Error finding container 32f2871e06d969503e7eb59b0eb0b8deec8714140864b9d2b15cf9db29cc37c6: Status 404 returned error can't find the container with id 32f2871e06d969503e7eb59b0eb0b8deec8714140864b9d2b15cf9db29cc37c6 Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.209225 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4afc1cb2_2dde_439a_88ea_8aaeedbadc53.slice/crio-0b58e73a7b9a8503ca6ff2f1d7fd99d2c81b6c2347aef51934189ab9d7fb1984 WatchSource:0}: Error finding container 0b58e73a7b9a8503ca6ff2f1d7fd99d2c81b6c2347aef51934189ab9d7fb1984: Status 404 returned error can't find 
the container with id 0b58e73a7b9a8503ca6ff2f1d7fd99d2c81b6c2347aef51934189ab9d7fb1984 Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.217120 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216ab588_0962_451e_8481_2dff1de05a59.slice/crio-88d6fd2041a86d3500a52c14d12c2db8375322423a1e648d3e221c1fe344ed1b WatchSource:0}: Error finding container 88d6fd2041a86d3500a52c14d12c2db8375322423a1e648d3e221c1fe344ed1b: Status 404 returned error can't find the container with id 88d6fd2041a86d3500a52c14d12c2db8375322423a1e648d3e221c1fe344ed1b Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.218844 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7407f54_5a8a_4aa2_a65c_5cba9feedd94.slice/crio-4664559cb07d7afd8673d45feec8e20289250cfbb21033e58474c51ebe05a020 WatchSource:0}: Error finding container 4664559cb07d7afd8673d45feec8e20289250cfbb21033e58474c51ebe05a020: Status 404 returned error can't find the container with id 4664559cb07d7afd8673d45feec8e20289250cfbb21033e58474c51ebe05a020 Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.220072 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rccmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-7468f855d8-th4tv_openstack-operators(216ab588-0962-451e-8481-2dff1de05a59): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.221093 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2641ba_13ac_4406_a4a7_79d2a4e7bafd.slice/crio-c18fa3ca0a10c1f9b479cc8022e8f504186af6ee2be7d8b371acce3cba374d53 WatchSource:0}: Error finding container c18fa3ca0a10c1f9b479cc8022e8f504186af6ee2be7d8b371acce3cba374d53: Status 404 returned error can't find the container with id c18fa3ca0a10c1f9b479cc8022e8f504186af6ee2be7d8b371acce3cba374d53 Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.223213 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ndqxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-brpsw_openstack-operators(1e2641ba-13ac-4406-a4a7-79d2a4e7bafd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.292926 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.304805 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-625x7"] Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.313906 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e66e7e7_1f92_49f9_b27f_a2e0c7ec568f.slice/crio-546fed82109c5b61027a48fc72a9a01049dec418a7e0b02c896fde689b7aca6a WatchSource:0}: Error finding container 546fed82109c5b61027a48fc72a9a01049dec418a7e0b02c896fde689b7aca6a: Status 404 returned error can't find the container with id 546fed82109c5b61027a48fc72a9a01049dec418a7e0b02c896fde689b7aca6a Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.316207 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjfcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-w99ph_openstack-operators(7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.317939 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf"] Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.320932 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e501127_6403_4475_896e_52efee97894e.slice/crio-fb5183c9fa2b0b767641f11fd109a8e89bd2e469139e7e8d5a8dd2ffd5cc4c46 WatchSource:0}: Error finding container fb5183c9fa2b0b767641f11fd109a8e89bd2e469139e7e8d5a8dd2ffd5cc4c46: Status 404 returned error can't find the container with id fb5183c9fa2b0b767641f11fd109a8e89bd2e469139e7e8d5a8dd2ffd5cc4c46 Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.325885 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x"] Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.331708 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6"] Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.331904 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sf9j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-625x7_openstack-operators(1e501127-6403-4475-896e-52efee97894e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.331969 4763 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g56s8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-nnrgf_openstack-operators(b754fcb6-88a9-42a9-9cfc-e624fe6d1afa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.335434 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn"] Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.346734 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m 
DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nncx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-69cq6_openstack-operators(c48055a7-5392-47b3-a30c-f2f97c8463cc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.347552 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l5vj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-6cbc6dd547-jsb9x_openstack-operators(948b7d9f-215b-47d3-b8dc-a953d26e9041): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.347833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" podUID="c48055a7-5392-47b3-a30c-f2f97c8463cc" Oct 06 15:09:25 crc kubenswrapper[4763]: W1006 15:09:25.360254 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5ee6ad_dda0_4cb8_8c8a_e8051083873a.slice/crio-4f78ce711137c2678db87f422de21e33effa4b43dfdd59096df504951ca11ea0 WatchSource:0}: Error finding container 4f78ce711137c2678db87f422de21e33effa4b43dfdd59096df504951ca11ea0: Status 404 returned error can't find the container with id 4f78ce711137c2678db87f422de21e33effa4b43dfdd59096df504951ca11ea0 Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.367230 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:R
ELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:curren
t-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn_openstack-operators(be5ee6ad-dda0-4cb8-8c8a-e8051083873a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.459337 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" podUID="1e2641ba-13ac-4406-a4a7-79d2a4e7bafd" Oct 
06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.462840 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" podUID="216ab588-0962-451e-8481-2dff1de05a59" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.571300 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" podUID="7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f" Oct 06 15:09:25 crc kubenswrapper[4763]: I1006 15:09:25.630048 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x"] Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.711575 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" podUID="948b7d9f-215b-47d3-b8dc-a953d26e9041" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.805486 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" podUID="1e501127-6403-4475-896e-52efee97894e" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.809556 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" podUID="be5ee6ad-dda0-4cb8-8c8a-e8051083873a" Oct 06 15:09:25 crc kubenswrapper[4763]: E1006 15:09:25.856003 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" podUID="b754fcb6-88a9-42a9-9cfc-e624fe6d1afa" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.073484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" event={"ID":"1e501127-6403-4475-896e-52efee97894e","Type":"ContainerStarted","Data":"e96105d0ab74b008d70947f22a6a1b80c0c502dadf477c6bf5ff392dc7458937"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.073539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" event={"ID":"1e501127-6403-4475-896e-52efee97894e","Type":"ContainerStarted","Data":"fb5183c9fa2b0b767641f11fd109a8e89bd2e469139e7e8d5a8dd2ffd5cc4c46"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.075997 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" podUID="1e501127-6403-4475-896e-52efee97894e" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.078386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" event={"ID":"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa","Type":"ContainerStarted","Data":"434413dbdfa8667fa613c029ea55235114549904421deff109c8fc18a36bc8cc"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.078420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" event={"ID":"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa","Type":"ContainerStarted","Data":"247c7887da4bb32ba56385c9571c4dae3444ce0847be871d0118453bb2340354"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.082988 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" podUID="b754fcb6-88a9-42a9-9cfc-e624fe6d1afa" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.109892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" event={"ID":"948b7d9f-215b-47d3-b8dc-a953d26e9041","Type":"ContainerStarted","Data":"34925117f4122a50534090c8de5f7c4d567ff5bc10f4ecd1ce6ddd468ffb84e4"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.109941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" event={"ID":"948b7d9f-215b-47d3-b8dc-a953d26e9041","Type":"ContainerStarted","Data":"b47455e6c5472e09a23ef2a51e67edafe2d2d1851178caff0b587519f305fa4b"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.128975 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" podUID="948b7d9f-215b-47d3-b8dc-a953d26e9041" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.135880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" event={"ID":"be5ee6ad-dda0-4cb8-8c8a-e8051083873a","Type":"ContainerStarted","Data":"9c15b3af34076b4f5f06df3cf32884d816e735d9a76388440bb9827c48b4060e"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.135921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" event={"ID":"be5ee6ad-dda0-4cb8-8c8a-e8051083873a","Type":"ContainerStarted","Data":"4f78ce711137c2678db87f422de21e33effa4b43dfdd59096df504951ca11ea0"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.144716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" event={"ID":"0c3ffc96-39cf-4c8d-9b14-90293eabb117","Type":"ContainerStarted","Data":"deff4a872edc78576c914eb4224de96b3b30c481c083300b71177a74928d3aae"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.158050 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" podUID="be5ee6ad-dda0-4cb8-8c8a-e8051083873a" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.158168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" event={"ID":"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5","Type":"ContainerStarted","Data":"941ddbdb469624e24e826b8c7ab13bdf5012bc53c4007cfe93d27bc5ff96071c"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.159795 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.159819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" event={"ID":"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5","Type":"ContainerStarted","Data":"4086f580668f565421bdba695d90b0f964a1265a5ff01dba02b550f4a99848ac"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.159833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" event={"ID":"4e80e79e-1bcd-40f5-a520-f8a850bd5cf5","Type":"ContainerStarted","Data":"caed5fec56e855d361da77e853447459a819bef40fe283ade79b02438a8e133d"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.160835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" event={"ID":"5f6171a0-9860-4ea2-b850-dc4a67f25499","Type":"ContainerStarted","Data":"570b0f11f0426a4f03207d73c26a008837a4b34f03fadd52dbf1045243849678"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.185524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" event={"ID":"d7407f54-5a8a-4aa2-a65c-5cba9feedd94","Type":"ContainerStarted","Data":"4664559cb07d7afd8673d45feec8e20289250cfbb21033e58474c51ebe05a020"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.227155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" event={"ID":"c48055a7-5392-47b3-a30c-f2f97c8463cc","Type":"ContainerStarted","Data":"0b07b160ade57032bce80ca4897f15aa157ed5c4f03431d67dfd71b54056a51f"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.228749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" podUID="c48055a7-5392-47b3-a30c-f2f97c8463cc" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.235238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" event={"ID":"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd","Type":"ContainerStarted","Data":"650e7a8b8481daa10672f196f5aca6d8dd32b3a5e28d4ffa72843cdf65985c66"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.235286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" event={"ID":"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd","Type":"ContainerStarted","Data":"c18fa3ca0a10c1f9b479cc8022e8f504186af6ee2be7d8b371acce3cba374d53"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.236764 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" podUID="1e2641ba-13ac-4406-a4a7-79d2a4e7bafd" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.275330 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" podStartSLOduration=2.27530036 podStartE2EDuration="2.27530036s" podCreationTimestamp="2025-10-06 15:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:09:26.21864217 +0000 UTC m=+963.373934682" watchObservedRunningTime="2025-10-06 15:09:26.27530036 +0000 UTC m=+963.430592872" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.276042 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" event={"ID":"216ab588-0962-451e-8481-2dff1de05a59","Type":"ContainerStarted","Data":"a121c2aded8b7bf2979cb4883500ab5daeb957872e3d8c12d1527341b296b7a8"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.276090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" event={"ID":"216ab588-0962-451e-8481-2dff1de05a59","Type":"ContainerStarted","Data":"88d6fd2041a86d3500a52c14d12c2db8375322423a1e648d3e221c1fe344ed1b"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.305203 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" podUID="216ab588-0962-451e-8481-2dff1de05a59" Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.309080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" event={"ID":"4afc1cb2-2dde-439a-88ea-8aaeedbadc53","Type":"ContainerStarted","Data":"0b58e73a7b9a8503ca6ff2f1d7fd99d2c81b6c2347aef51934189ab9d7fb1984"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.314054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" event={"ID":"a0c790cf-619e-4986-8dbb-f9ed71be08c7","Type":"ContainerStarted","Data":"32f2871e06d969503e7eb59b0eb0b8deec8714140864b9d2b15cf9db29cc37c6"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 15:09:26.323893 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" event={"ID":"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f","Type":"ContainerStarted","Data":"8864c6b5dd0bf806ee14f7b5d1c8950a9f3f19095178f89f2407b7da49b62f9e"} Oct 06 15:09:26 crc kubenswrapper[4763]: I1006 
15:09:26.323936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" event={"ID":"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f","Type":"ContainerStarted","Data":"546fed82109c5b61027a48fc72a9a01049dec418a7e0b02c896fde689b7aca6a"} Oct 06 15:09:26 crc kubenswrapper[4763]: E1006 15:09:26.325929 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" podUID="7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.333185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" podUID="1e501127-6403-4475-896e-52efee97894e" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.334902 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" podUID="216ab588-0962-451e-8481-2dff1de05a59" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.335056 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" podUID="948b7d9f-215b-47d3-b8dc-a953d26e9041" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.335862 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" podUID="1e2641ba-13ac-4406-a4a7-79d2a4e7bafd" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.337182 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" podUID="7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.337225 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" podUID="be5ee6ad-dda0-4cb8-8c8a-e8051083873a" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.337541 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" podUID="b754fcb6-88a9-42a9-9cfc-e624fe6d1afa" Oct 06 15:09:27 crc kubenswrapper[4763]: E1006 15:09:27.339511 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" podUID="c48055a7-5392-47b3-a30c-f2f97c8463cc" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.413769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" event={"ID":"a0c790cf-619e-4986-8dbb-f9ed71be08c7","Type":"ContainerStarted","Data":"c15abd9d61f82780e933ec1aaeb38c546d91bc1141995643b3601b45165890aa"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.442860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" event={"ID":"dd4eb024-6f04-4ee5-a485-78311ddee488","Type":"ContainerStarted","Data":"7728ec56f50ae33727572bfd24db44b0bb5713e171d343f52d558cbc70325ff3"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.456664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" event={"ID":"d7407f54-5a8a-4aa2-a65c-5cba9feedd94","Type":"ContainerStarted","Data":"d9d28bd335cfe37a425e6620ca388271d3594878146bd0263516c369571874b8"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.456736 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.486541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" event={"ID":"e5c474ec-88f8-4c60-a866-fcc48cf5bcec","Type":"ContainerStarted","Data":"5e00774c1c05415a35ab0f18cc26051aaf8be52c3bbb1cbdb98c829d2a12a017"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.506145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" event={"ID":"5f6171a0-9860-4ea2-b850-dc4a67f25499","Type":"ContainerStarted","Data":"9fef5e838a3668c8e602530aa6983c429be7f8ffeb160e7a729804167bd02893"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.520185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" event={"ID":"0c3ffc96-39cf-4c8d-9b14-90293eabb117","Type":"ContainerStarted","Data":"994db239905aef3ccd2450f53c96524882066fac7a966e6f7f77f7302df7d029"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.523665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" event={"ID":"af8be53e-818e-4c1e-a90c-800b02a679e4","Type":"ContainerStarted","Data":"00fe2633ee2b629ea6c168d2a3afd204b4743795edce9db53598693ffa683db7"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.534407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" event={"ID":"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1","Type":"ContainerStarted","Data":"59f18bc81c5f78d715345d1eb88efa1fe47883e1456f79de94d460ab21f9a254"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.535225 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.542305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" event={"ID":"f76d9cdd-3955-4727-87b4-20bf248be0f2","Type":"ContainerStarted","Data":"a7b17dc983ea366600f32a7758671ba157646ccfbe5bfb21b673af48ccd635ec"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.544259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" event={"ID":"2c0fa69d-0c27-48bc-b120-e1416445de40","Type":"ContainerStarted","Data":"ef00c5d78bdab151ee34e45fe6b77d775daaae388f18d80d656b9f9c45b9ec68"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.547150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" event={"ID":"4afc1cb2-2dde-439a-88ea-8aaeedbadc53","Type":"ContainerStarted","Data":"306882dbed10587ecd722ef0b7e93181e982aa09d327b3f430f97390f205a005"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.548466 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.549523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" event={"ID":"96be297b-bf5c-4abe-853f-8562201ec721","Type":"ContainerStarted","Data":"b02a8716f5b04bd90897623157892369592a7c652ec160bdf91775c5080f302b"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.558847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" event={"ID":"97d973de-bf67-44c5-b76c-dcb36cab65b4","Type":"ContainerStarted","Data":"90002132608ad5c1fc4ba464e5f374119b7642e0efa0479f3af8c268f9a0f799"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.559891 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" podStartSLOduration=3.3788039579999998 podStartE2EDuration="11.559878443s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.222744469 +0000 UTC m=+962.378036981" lastFinishedPulling="2025-10-06 15:09:33.403818954 +0000 UTC m=+970.559111466" observedRunningTime="2025-10-06 15:09:34.48658898 +0000 UTC m=+971.641881492" watchObservedRunningTime="2025-10-06 15:09:34.559878443 +0000 UTC m=+971.715170955" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.560115 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" podStartSLOduration=2.921569705 podStartE2EDuration="11.560110278s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.755381397 +0000 UTC m=+961.910673909" lastFinishedPulling="2025-10-06 15:09:33.39392197 +0000 UTC m=+970.549214482" observedRunningTime="2025-10-06 15:09:34.556095563 +0000 UTC m=+971.711388075" watchObservedRunningTime="2025-10-06 15:09:34.560110278 +0000 UTC m=+971.715402790" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.578383 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" podStartSLOduration=3.389585962 podStartE2EDuration="11.57836844s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.211224236 +0000 UTC m=+962.366516738" lastFinishedPulling="2025-10-06 15:09:33.400006704 +0000 UTC m=+970.555299216" observedRunningTime="2025-10-06 15:09:34.577276204 +0000 UTC m=+971.732568716" watchObservedRunningTime="2025-10-06 15:09:34.57836844 +0000 UTC m=+971.733660952" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.579448 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" event={"ID":"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef","Type":"ContainerStarted","Data":"96db1461a206c90fac4f57460f2779a8efe9dfb6d1da982a668e3fe14d5ba629"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.579480 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" event={"ID":"4a94b2e1-c815-45d3-b7ac-aa7b632af0ef","Type":"ContainerStarted","Data":"1e96d52bf1d87e41fceeef4b6b236f7351f6105a15305394fce021efaf16d02a"} Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.580035 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:34 crc kubenswrapper[4763]: I1006 15:09:34.601217 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" podStartSLOduration=2.965445833 podStartE2EDuration="11.60119879s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.757350224 +0000 UTC m=+961.912642736" lastFinishedPulling="2025-10-06 15:09:33.393103181 +0000 UTC m=+970.548395693" observedRunningTime="2025-10-06 15:09:34.597985064 +0000 UTC m=+971.753277586" watchObservedRunningTime="2025-10-06 15:09:34.60119879 +0000 UTC m=+971.756491302" Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.126472 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-847bc59d9d-vvb5x" Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.593416 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" event={"ID":"4afc1cb2-2dde-439a-88ea-8aaeedbadc53","Type":"ContainerStarted","Data":"c0141fed19b21bf14f0f78d443389a5a47de570f407e841f99edb6292b5ca189"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.596183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" 
event={"ID":"dd4eb024-6f04-4ee5-a485-78311ddee488","Type":"ContainerStarted","Data":"ba91dbbd53d7bfdfbd21d2d1577c53aca8ff42ca7e0e314cdd5fc515c53d9242"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.596504 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.599044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" event={"ID":"64a78de0-4ad4-40f2-bf1c-a830f4c32dd1","Type":"ContainerStarted","Data":"1f019d1f0e4caa330aad97b0624864a05330204681f99d5d181f748cb73ee1f2"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.601827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" event={"ID":"0c3ffc96-39cf-4c8d-9b14-90293eabb117","Type":"ContainerStarted","Data":"b4fb21352b4d688a0aa8cfc419d0380d5b417172a3192d3365e23ac9a676783a"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.607037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" event={"ID":"d7407f54-5a8a-4aa2-a65c-5cba9feedd94","Type":"ContainerStarted","Data":"0c4613d0e2f5adf3a2aaa68ca6d4b7450dcd56303e8f6198dceb92249461090f"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.610333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" event={"ID":"af8be53e-818e-4c1e-a90c-800b02a679e4","Type":"ContainerStarted","Data":"12a896cc67922b98e922e8109fd671cbc8f93af0b5ebb2b51576efea4faddc9a"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.615165 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" event={"ID":"2c0fa69d-0c27-48bc-b120-e1416445de40","Type":"ContainerStarted","Data":"17a45430edacc757da88aab1fb6f7b0167d5749a60eca4a190b01de5e02bf1b5"} Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.624090 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" podStartSLOduration=3.636960813 podStartE2EDuration="12.624072908s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.404800317 +0000 UTC m=+961.560092829" lastFinishedPulling="2025-10-06 15:09:33.391912412 +0000 UTC m=+970.547204924" observedRunningTime="2025-10-06 15:09:35.618407274 +0000 UTC m=+972.773699806" watchObservedRunningTime="2025-10-06 15:09:35.624072908 +0000 UTC m=+972.779365430" Oct 06 15:09:35 crc kubenswrapper[4763]: I1006 15:09:35.650058 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" podStartSLOduration=4.025774167 podStartE2EDuration="12.650029652s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.804207922 +0000 UTC m=+961.959500434" lastFinishedPulling="2025-10-06 15:09:33.428463407 +0000 UTC m=+970.583755919" observedRunningTime="2025-10-06 15:09:35.64108099 +0000 UTC m=+972.796373562" watchObservedRunningTime="2025-10-06 15:09:35.650029652 +0000 UTC m=+972.805322204" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.625327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" event={"ID":"e5c474ec-88f8-4c60-a866-fcc48cf5bcec","Type":"ContainerStarted","Data":"9546038d93a5c2237658f9fc7b2e4c4895eb956ed8dea7d7bf0505d40d18df77"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.625425 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.628194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" event={"ID":"5f6171a0-9860-4ea2-b850-dc4a67f25499","Type":"ContainerStarted","Data":"f6833a4c0593c323527bfd0a2199d4d7d11eea0bc5e650e298ba7cbec6d9f875"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.628363 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.631137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" event={"ID":"97d973de-bf67-44c5-b76c-dcb36cab65b4","Type":"ContainerStarted","Data":"3fd92016ba5c9dfcfa8f740898b91ce7574744f8e99a6021f5b3e61ceb97837d"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.631801 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.633352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" event={"ID":"f76d9cdd-3955-4727-87b4-20bf248be0f2","Type":"ContainerStarted","Data":"3f45e18d3ac78e44f0d4a0aa65972c632dc26ef880a7d632ed72c14566322d9d"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.633868 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.635735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" event={"ID":"96be297b-bf5c-4abe-853f-8562201ec721","Type":"ContainerStarted","Data":"80778ee2fededc836ff8b367282f5f1cb684186280b24595ac0e095504e3383c"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.636056 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.638299 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" event={"ID":"a0c790cf-619e-4986-8dbb-f9ed71be08c7","Type":"ContainerStarted","Data":"d05f9b9ad67ba0ff2bfef133c6097f23a5fc47a15e1d2fe39ed06ae819727640"} Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.639809 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.639855 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.652886 4763 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" podStartSLOduration=4.841223061 podStartE2EDuration="13.652868887s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.620566859 +0000 UTC m=+961.775859371" lastFinishedPulling="2025-10-06 15:09:33.432212685 +0000 UTC m=+970.587505197" observedRunningTime="2025-10-06 15:09:36.642156003 +0000 UTC m=+973.797448545" watchObservedRunningTime="2025-10-06 15:09:36.652868887 +0000 UTC m=+973.808161409" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.664847 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" podStartSLOduration=4.9792136849999995 podStartE2EDuration="13.66482732s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.689841288 +0000 UTC m=+961.845133800" lastFinishedPulling="2025-10-06 15:09:33.375454923 +0000 UTC m=+970.530747435" observedRunningTime="2025-10-06 15:09:36.66017399 +0000 UTC m=+973.815466512" watchObservedRunningTime="2025-10-06 15:09:36.66482732 +0000 UTC m=+973.820119832" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.690672 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" podStartSLOduration=5.486914168 podStartE2EDuration="13.69065038s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.199470948 +0000 UTC m=+962.354763460" lastFinishedPulling="2025-10-06 15:09:33.40320716 +0000 UTC m=+970.558499672" observedRunningTime="2025-10-06 15:09:36.686177585 +0000 UTC m=+973.841470107" watchObservedRunningTime="2025-10-06 15:09:36.69065038 +0000 UTC m=+973.845942892" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.712179 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" podStartSLOduration=5.5070936360000005 podStartE2EDuration="13.712160519s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.194689405 +0000 UTC m=+962.349981917" lastFinishedPulling="2025-10-06 15:09:33.399756288 +0000 UTC m=+970.555048800" observedRunningTime="2025-10-06 15:09:36.706212938 +0000 UTC m=+973.861505470" watchObservedRunningTime="2025-10-06 15:09:36.712160519 +0000 UTC m=+973.867453031" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.723837 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" podStartSLOduration=5.093543257 podStartE2EDuration="13.723798204s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.756138155 +0000 UTC m=+961.911430667" lastFinishedPulling="2025-10-06 15:09:33.386393102 +0000 UTC m=+970.541685614" observedRunningTime="2025-10-06 15:09:36.721853658 +0000 UTC m=+973.877146170" watchObservedRunningTime="2025-10-06 15:09:36.723798204 +0000 UTC m=+973.879090716" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.742180 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" podStartSLOduration=5.274434444 podStartE2EDuration="13.742159508s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 
15:09:24.964814129 +0000 UTC m=+962.120106641" lastFinishedPulling="2025-10-06 15:09:33.432539193 +0000 UTC m=+970.587831705" observedRunningTime="2025-10-06 15:09:36.736565426 +0000 UTC m=+973.891857948" watchObservedRunningTime="2025-10-06 15:09:36.742159508 +0000 UTC m=+973.897452030" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.773906 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" podStartSLOduration=5.193216515 podStartE2EDuration="13.773877608s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:24.819871643 +0000 UTC m=+961.975164155" lastFinishedPulling="2025-10-06 15:09:33.400532736 +0000 UTC m=+970.555825248" observedRunningTime="2025-10-06 15:09:36.763425391 +0000 UTC m=+973.918717903" watchObservedRunningTime="2025-10-06 15:09:36.773877608 +0000 UTC m=+973.929170150" Oct 06 15:09:36 crc kubenswrapper[4763]: I1006 15:09:36.784190 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" podStartSLOduration=5.596742926 podStartE2EDuration="13.784174552s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.205672665 +0000 UTC m=+962.360965177" lastFinishedPulling="2025-10-06 15:09:33.393104291 +0000 UTC m=+970.548396803" observedRunningTime="2025-10-06 15:09:36.780007303 +0000 UTC m=+973.935299815" watchObservedRunningTime="2025-10-06 15:09:36.784174552 +0000 UTC m=+973.939467064" Oct 06 15:09:37 crc kubenswrapper[4763]: I1006 15:09:37.647384 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:38 crc kubenswrapper[4763]: I1006 15:09:38.578673 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:09:38 crc kubenswrapper[4763]: I1006 15:09:38.660754 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-ctdbs" Oct 06 15:09:38 crc kubenswrapper[4763]: I1006 15:09:38.661863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-m4l5b" Oct 06 15:09:38 crc kubenswrapper[4763]: I1006 15:09:38.661969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-x7zlp" Oct 06 15:09:41 crc kubenswrapper[4763]: I1006 15:09:41.682243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" event={"ID":"be5ee6ad-dda0-4cb8-8c8a-e8051083873a","Type":"ContainerStarted","Data":"15cdcd6f48448a2e32d0537fa662b7863034f504af3e70cbf094839d4547b030"} Oct 06 15:09:41 crc kubenswrapper[4763]: I1006 15:09:41.682874 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:09:41 crc kubenswrapper[4763]: I1006 15:09:41.712422 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" podStartSLOduration=3.178821178 podStartE2EDuration="18.712399034s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" 
firstStartedPulling="2025-10-06 15:09:25.366857467 +0000 UTC m=+962.522149979" lastFinishedPulling="2025-10-06 15:09:40.900435313 +0000 UTC m=+978.055727835" observedRunningTime="2025-10-06 15:09:41.705683785 +0000 UTC m=+978.860976317" watchObservedRunningTime="2025-10-06 15:09:41.712399034 +0000 UTC m=+978.867691556" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.679541 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-ldl9r" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.777322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2k5df" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.794306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-cwcd8" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.841721 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-pc484" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.850949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-s54qg" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.986765 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.987259 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-lwhfc" Oct 06 15:09:43 crc kubenswrapper[4763]: I1006 15:09:43.989344 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-7hqwv" Oct 06 15:09:44 crc kubenswrapper[4763]: I1006 15:09:44.065318 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd" Oct 06 15:09:44 crc kubenswrapper[4763]: I1006 15:09:44.093878 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-dp6zq" Oct 06 15:09:44 crc kubenswrapper[4763]: I1006 15:09:44.309560 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-zlwcn" Oct 06 15:09:44 crc kubenswrapper[4763]: I1006 15:09:44.323794 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-hbvnb" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.727654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" event={"ID":"7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f","Type":"ContainerStarted","Data":"a565b2259e33a67718b1307a6387cebeb0b43d8fcd255188e7df37e3a38a3f24"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.728461 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.729517 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" event={"ID":"948b7d9f-215b-47d3-b8dc-a953d26e9041","Type":"ContainerStarted","Data":"2f2591f19ad64be77fca73b4156a205ad2f61ec4dcd74ef5d21715dd04a13ded"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.729713 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.730908 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" event={"ID":"c48055a7-5392-47b3-a30c-f2f97c8463cc","Type":"ContainerStarted","Data":"450ed5fd98d0afa77fd0c90a3eb5de1d25d6f6a9775ef735ab5a24ad7e33f827"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.733030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" event={"ID":"1e501127-6403-4475-896e-52efee97894e","Type":"ContainerStarted","Data":"de97ec2b34df0158d1226e8d6431cdc3e3b1bd55121ad81eaf54260074f2a97a"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.733437 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.735631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" event={"ID":"216ab588-0962-451e-8481-2dff1de05a59","Type":"ContainerStarted","Data":"915453f3c6cf200d61cc30a8ab57a8f779129ca353ac5c94ac8c6c726cc9288a"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.735801 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.737217 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" event={"ID":"1e2641ba-13ac-4406-a4a7-79d2a4e7bafd","Type":"ContainerStarted","Data":"baedd395edca8c65244d80750a1ee265b170abf124002450dc97c085109998ad"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.737389 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.739145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" event={"ID":"b754fcb6-88a9-42a9-9cfc-e624fe6d1afa","Type":"ContainerStarted","Data":"336e09665ce9c6b2acfc08bf54c62f55e6cba42cb6316f09c44df91616120292"} Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.739280 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.748023 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" podStartSLOduration=2.783573401 podStartE2EDuration="23.748009585s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.316069656 +0000 UTC m=+962.471362168" lastFinishedPulling="2025-10-06 15:09:46.28050584 +0000 UTC m=+983.435798352" 
observedRunningTime="2025-10-06 15:09:46.746791866 +0000 UTC m=+983.902084388" watchObservedRunningTime="2025-10-06 15:09:46.748009585 +0000 UTC m=+983.903302097" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.765463 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" podStartSLOduration=3.655123121 podStartE2EDuration="23.765444478s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.331769057 +0000 UTC m=+962.487061569" lastFinishedPulling="2025-10-06 15:09:45.442090414 +0000 UTC m=+982.597382926" observedRunningTime="2025-10-06 15:09:46.760109971 +0000 UTC m=+983.915402493" watchObservedRunningTime="2025-10-06 15:09:46.765444478 +0000 UTC m=+983.920736990" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.775742 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-69cq6" podStartSLOduration=1.8511549189999998 podStartE2EDuration="22.775718111s" podCreationTimestamp="2025-10-06 15:09:24 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.346567277 +0000 UTC m=+962.501859789" lastFinishedPulling="2025-10-06 15:09:46.271130469 +0000 UTC m=+983.426422981" observedRunningTime="2025-10-06 15:09:46.772164217 +0000 UTC m=+983.927456749" watchObservedRunningTime="2025-10-06 15:09:46.775718111 +0000 UTC m=+983.931010623" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.818307 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" podStartSLOduration=2.894506524 podStartE2EDuration="23.818289177s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.347367236 +0000 UTC m=+962.502659748" lastFinishedPulling="2025-10-06 15:09:46.271149849 +0000 UTC m=+983.426442401" observedRunningTime="2025-10-06 15:09:46.800858165 +0000 UTC m=+983.956150677" watchObservedRunningTime="2025-10-06 15:09:46.818289177 +0000 UTC m=+983.973581699" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.820900 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" podStartSLOduration=2.899561393 podStartE2EDuration="23.820886129s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.331784957 +0000 UTC m=+962.487077469" lastFinishedPulling="2025-10-06 15:09:46.253109693 +0000 UTC m=+983.408402205" observedRunningTime="2025-10-06 15:09:46.816384382 +0000 UTC m=+983.971676924" watchObservedRunningTime="2025-10-06 15:09:46.820886129 +0000 UTC m=+983.976178661" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.842682 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" podStartSLOduration=2.7730230110000003 podStartE2EDuration="23.842663744s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.21984823 +0000 UTC m=+962.375140742" lastFinishedPulling="2025-10-06 15:09:46.289488963 +0000 UTC m=+983.444781475" observedRunningTime="2025-10-06 15:09:46.840011781 +0000 UTC m=+983.995304303" watchObservedRunningTime="2025-10-06 15:09:46.842663744 +0000 UTC m=+983.997956276" Oct 06 15:09:46 crc kubenswrapper[4763]: I1006 15:09:46.864840 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" podStartSLOduration=2.799247692 podStartE2EDuration="23.864819318s" podCreationTimestamp="2025-10-06 15:09:23 +0000 UTC" firstStartedPulling="2025-10-06 15:09:25.223095787 +0000 UTC m=+962.378388299" lastFinishedPulling="2025-10-06 15:09:46.288667413 +0000 UTC m=+983.443959925" observedRunningTime="2025-10-06 15:09:46.858952239 +0000 UTC m=+984.014244761" watchObservedRunningTime="2025-10-06 15:09:46.864819318 +0000 UTC m=+984.020111830" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.221786 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-th4tv" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.348504 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-brpsw" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.422663 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nnrgf" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.457403 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-w99ph" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.464584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-625x7" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.495845 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-jsb9x" Oct 06 15:09:54 crc kubenswrapper[4763]: I1006 15:09:54.592928 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn" Oct 06 15:10:03 crc kubenswrapper[4763]: I1006 15:10:03.876687 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:10:03 crc kubenswrapper[4763]: I1006 15:10:03.877529 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.295952 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"] Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.297792 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.300227 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.300279 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.302101 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tdsks" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.302102 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.309253 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"] Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.361113 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"] Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.362240 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.366319 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.373711 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"] Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.418320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.418395 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcgts\" (UniqueName: \"kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.520036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.520086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.520241 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.520303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.520418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcgts\" (UniqueName: \"kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.521137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.547913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcgts\" (UniqueName: \"kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts\") pod \"dnsmasq-dns-675f4bcbfc-2p55t\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.621813 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.622252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.622322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.622374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.623104 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.623255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: 
\"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.638815 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx\") pod \"dnsmasq-dns-78dd6ddcc-ctt95\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:10 crc kubenswrapper[4763]: I1006 15:10:10.678793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" Oct 06 15:10:11 crc kubenswrapper[4763]: I1006 15:10:11.029046 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"] Oct 06 15:10:11 crc kubenswrapper[4763]: W1006 15:10:11.032249 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5736922b_33dd_4ac5_8c9b_d199c72b5c78.slice/crio-c4ed771f55ba622bdca4f3c918d4c691658d773681c152e43c9da5a0e2cdd93d WatchSource:0}: Error finding container c4ed771f55ba622bdca4f3c918d4c691658d773681c152e43c9da5a0e2cdd93d: Status 404 returned error can't find the container with id c4ed771f55ba622bdca4f3c918d4c691658d773681c152e43c9da5a0e2cdd93d Oct 06 15:10:11 crc kubenswrapper[4763]: I1006 15:10:11.172547 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"] Oct 06 15:10:11 crc kubenswrapper[4763]: W1006 15:10:11.183284 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b087804_1c0f_432d_b3e4_44a41e4a9757.slice/crio-d2a158e6e9d976f59cd3c025e9d8560f60879dae29351fad425226189a0f5ee7 WatchSource:0}: Error finding container d2a158e6e9d976f59cd3c025e9d8560f60879dae29351fad425226189a0f5ee7: Status 404 returned error can't find the container with id d2a158e6e9d976f59cd3c025e9d8560f60879dae29351fad425226189a0f5ee7 Oct 06 15:10:11 crc kubenswrapper[4763]: I1006 15:10:11.984603 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" event={"ID":"2b087804-1c0f-432d-b3e4-44a41e4a9757","Type":"ContainerStarted","Data":"d2a158e6e9d976f59cd3c025e9d8560f60879dae29351fad425226189a0f5ee7"} Oct 06 15:10:11 crc kubenswrapper[4763]: I1006 15:10:11.986917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" event={"ID":"5736922b-33dd-4ac5-8c9b-d199c72b5c78","Type":"ContainerStarted","Data":"c4ed771f55ba622bdca4f3c918d4c691658d773681c152e43c9da5a0e2cdd93d"} Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.042064 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"] Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.070372 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"] Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.071501 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.088400 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"]
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.167532 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.167600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.167635 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp8tc\" (UniqueName: \"kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.272455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.272538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.272567 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp8tc\" (UniqueName: \"kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.273519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.273526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.317814 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp8tc\" (UniqueName: \"kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc\") pod \"dnsmasq-dns-5ccc8479f9-q287r\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") " pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.331080 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"]
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.353204 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"]
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.354415 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.384972 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"]
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.395096 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.481463 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.481571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.481700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvlxx\" (UniqueName: \"kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.583491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.585234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvlxx\" (UniqueName: \"kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.585289 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.595572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.596438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.654245 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvlxx\" (UniqueName: \"kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx\") pod \"dnsmasq-dns-57d769cc4f-t8zrr\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.704105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.944280 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"]
Oct 06 15:10:13 crc kubenswrapper[4763]: W1006 15:10:13.955233 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod024fe1c7_e7a0_4ae6_a8ea_63f51fcb9f31.slice/crio-8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb WatchSource:0}: Error finding container 8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb: Status 404 returned error can't find the container with id 8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb
Oct 06 15:10:13 crc kubenswrapper[4763]: I1006 15:10:13.973208 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"]
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.006696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" event={"ID":"ab996378-985c-4fa8-bcc8-1eccde288a1e","Type":"ContainerStarted","Data":"9881b380b5b3d2fedf559ca6d737f14f0ab871aabb8275ad3bba15c3fc2b3b7c"}
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.011149 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" event={"ID":"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31","Type":"ContainerStarted","Data":"8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb"}
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.212523 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.220506 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.223152 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d7c8n"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.223372 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.223499 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.223516 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.226994 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.227112 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.227693 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.227983 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.308800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt25\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309132 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.309230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410749 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt25\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410894 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.410932 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.411342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.411674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.411686 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.412199 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.412482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.413289 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.414962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.415718 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.416375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.416862 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.425726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt25\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.430859 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.467706 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.469693 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.472607 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.473016 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.473073 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gxlsk"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.473932 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.474007 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.474040 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.474490 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.477993 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.553583 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.612453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.612499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.612526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.612546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.612768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.613992 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.614171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.614261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4ht\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.614325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.614366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.614412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716410 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716458 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4ht\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716477 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716635 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716680 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.716693 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.717512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.718074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.717953 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.718124 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.718212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.718196 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.721607 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.722180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.724499 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.732684 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4ht\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.743158 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.743433 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") " pod="openstack/rabbitmq-server-0"
Oct 06 15:10:14 crc kubenswrapper[4763]: I1006 15:10:14.804053 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.974418 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.977562 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.980067 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.981805 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.981951 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.983348 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.984170 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-h6pn8"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.994501 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 06 15:10:15 crc kubenswrapper[4763]: I1006 15:10:15.995911 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hfw\" (UniqueName: \"kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036493 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.036516 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.137891 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.137930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.137958 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.137988 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hfw\" (UniqueName: \"kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138015 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138029 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.138304 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.140056 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.140651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.141701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.142004 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.144928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.145882 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.155912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hfw\" (UniqueName: \"kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.156147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.170229 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " pod="openstack/openstack-galera-0"
Oct 06 15:10:16 crc kubenswrapper[4763]: I1006 15:10:16.302576 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.311312 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.313236 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.316744 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.318262 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.318701 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.318905 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rnmhc"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.323071 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvjn\" (UniqueName: \"kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355447 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355533 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355603 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.355727 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.457787 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.457899 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvjn\" (UniqueName: \"kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.457937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.457990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.458042 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.459169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.459204 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.459877 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.460408 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.458199 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.461590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.461655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.462386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.464037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.465383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.471423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.471971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.478633 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzvjn\" (UniqueName: \"kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.484694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.636897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.730327 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.738373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.743041 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.749657 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-m84x2"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.749877 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.770836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4w6c\" (UniqueName: \"kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.770977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.771088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.771117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.771186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.778966 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.873116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.873216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.873243 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.873313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.873339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4w6c\" (UniqueName: \"kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.874661 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.874782 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.879357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.879361 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:17 crc kubenswrapper[4763]: I1006 15:10:17.887550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4w6c\" (UniqueName: \"kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c\") pod \"memcached-0\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") " pod="openstack/memcached-0"
Oct 06 15:10:18 crc kubenswrapper[4763]: I1006 15:10:18.066841 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.352326 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.353306 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.356214 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-d9vd9"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.365866 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.398494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxlxc\" (UniqueName: \"kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc\") pod \"kube-state-metrics-0\" (UID: \"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.499638 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxlxc\" (UniqueName: \"kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc\") pod \"kube-state-metrics-0\" (UID: \"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.517435 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxlxc\" (UniqueName: \"kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc\") pod \"kube-state-metrics-0\" (UID: \"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:10:19 crc kubenswrapper[4763]: I1006 15:10:19.671120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.811006 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lw4hs"]
Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.812386 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.818175 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.818185 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.859254 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lz7vf" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.859997 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lw4hs"] Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872085 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872136 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjgkv\" (UniqueName: \"kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872159 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872176 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.872241 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.886175 4763 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"] Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.888129 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.898003 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"] Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974358 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974413 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974460 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974642 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974667 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznqd\" (UniqueName: \"kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974750 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjgkv\" (UniqueName: \"kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974873 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.974900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.976364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.977550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.977739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.978583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.981033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs\") pod \"ovn-controller-lw4hs\" 
(UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.981168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:23 crc kubenswrapper[4763]: I1006 15:10:23.994451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjgkv\" (UniqueName: \"kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv\") pod \"ovn-controller-lw4hs\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") " pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.075952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dznqd\" (UniqueName: \"kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.076578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.078746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.085546 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.087063 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.089575 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.089949 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.090203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.090372 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.090479 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8hffh" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.095511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dznqd\" (UniqueName: \"kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd\") pod \"ovn-controller-ovs-cf4dn\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.109727 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.172678 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lw4hs" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177587 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7c4z\" (UniqueName: \"kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177749 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.177869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.178056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.202145 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.279548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.279904 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7c4z\" (UniqueName: \"kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.279951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.279972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.279998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.280025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.280058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.280144 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.280504 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.280794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.281432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.281532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.283926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.285309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.298464 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.301306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.302356 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7c4z\" (UniqueName: \"kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z\") pod \"ovsdbserver-nb-0\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") " pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:24 crc kubenswrapper[4763]: I1006 15:10:24.429690 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.857550 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.859251 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.861944 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hk59t" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.862172 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.862381 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.862641 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.895048 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917730 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv8dw\" (UniqueName: \"kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " 
pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:25 crc kubenswrapper[4763]: I1006 15:10:25.917957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019557 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv8dw\" (UniqueName: \"kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.019924 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.020135 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.020993 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.021852 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.022401 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.026062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.028074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.028841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.039477 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv8dw\" (UniqueName: \"kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.048453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") " pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.155112 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.155279 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ctt95_openstack(5736922b-33dd-4ac5-8c9b-d199c72b5c78): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.157522 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" podUID="5736922b-33dd-4ac5-8c9b-d199c72b5c78" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.171062 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.171226 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lcgts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2p55t_openstack(2b087804-1c0f-432d-b3e4-44a41e4a9757): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 15:10:26 crc kubenswrapper[4763]: E1006 15:10:26.172575 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" podUID="2b087804-1c0f-432d-b3e4-44a41e4a9757" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.193398 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.549521 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 15:10:26 crc kubenswrapper[4763]: W1006 15:10:26.563171 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fad9bbe_33dc_4f1d_a156_52bbd3a69273.slice/crio-125782bda6f71dae1aae108fb18daaeda2d030e4707c1c6a8f51dee433b1b856 WatchSource:0}: Error finding container 125782bda6f71dae1aae108fb18daaeda2d030e4707c1c6a8f51dee433b1b856: Status 404 returned error can't find the container with id 125782bda6f71dae1aae108fb18daaeda2d030e4707c1c6a8f51dee433b1b856 Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.806439 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 15:10:26 crc kubenswrapper[4763]: W1006 15:10:26.810465 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b82aad8_d8ea_4e3f_9b7f_973232a5ffac.slice/crio-c616aa6561518f62d7511516d602dd87d23300e10e9d03e6cec6b42d7eb2ae74 WatchSource:0}: Error finding container c616aa6561518f62d7511516d602dd87d23300e10e9d03e6cec6b42d7eb2ae74: Status 404 returned error can't find the container with id c616aa6561518f62d7511516d602dd87d23300e10e9d03e6cec6b42d7eb2ae74 Oct 06 15:10:26 crc kubenswrapper[4763]: W1006 15:10:26.811893 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd7fbde_cddf_41fe_9a6e_6b1cdba389de.slice/crio-4c1d21dd34b79f3b673ecb2ff7924594aa3a579f7110e79747f6c190a8b7f833 WatchSource:0}: Error finding container 4c1d21dd34b79f3b673ecb2ff7924594aa3a579f7110e79747f6c190a8b7f833: Status 404 returned error can't find the container with id 4c1d21dd34b79f3b673ecb2ff7924594aa3a579f7110e79747f6c190a8b7f833 Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.812269 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.817881 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 15:10:26 crc kubenswrapper[4763]: W1006 15:10:26.836768 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42d3e722_26a6_40fa_9762_7da59b0009b7.slice/crio-4bc4599d1c35e74df80d583142c8049f1b02bafc4828e8e2648a2835e96dd8ad WatchSource:0}: Error finding container 4bc4599d1c35e74df80d583142c8049f1b02bafc4828e8e2648a2835e96dd8ad: Status 404 returned error can't find the container with id 4bc4599d1c35e74df80d583142c8049f1b02bafc4828e8e2648a2835e96dd8ad Oct 06 15:10:26 crc kubenswrapper[4763]: I1006 15:10:26.838399 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 15:10:26 crc kubenswrapper[4763]: W1006 15:10:26.842423 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c83a4de_f6df_4d0e_9bd0_03cbcb877f43.slice/crio-260709a27efd28981c554ca791871e2e0beeb8f26d1c602be4aa85acd764dd72 WatchSource:0}: Error finding container 260709a27efd28981c554ca791871e2e0beeb8f26d1c602be4aa85acd764dd72: Status 404 returned error can't find the container with id 260709a27efd28981c554ca791871e2e0beeb8f26d1c602be4aa85acd764dd72 Oct 06 15:10:26 crc 
kubenswrapper[4763]: I1006 15:10:26.852671 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.009005 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lw4hs"]
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.101257 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.200501 4763 generic.go:334] "Generic (PLEG): container finished" podID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerID="059180dfffb2f4ec5062d7c004b4bc2124526f5d57c5fd6b90a56b685d97f34b" exitCode=0
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.200578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" event={"ID":"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31","Type":"ContainerDied","Data":"059180dfffb2f4ec5062d7c004b4bc2124526f5d57c5fd6b90a56b685d97f34b"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.205284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs" event={"ID":"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7","Type":"ContainerStarted","Data":"6a8fba37f509e3de46058858ddef56d825fa8ec520581906fca22fc7c3f26c4c"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.208304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerStarted","Data":"5d8e840ce558cd8c9947c823e5582e6d99fb4564e364740acfedd1b0f30b9743"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.209223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerStarted","Data":"125782bda6f71dae1aae108fb18daaeda2d030e4707c1c6a8f51dee433b1b856"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.211224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerStarted","Data":"260709a27efd28981c554ca791871e2e0beeb8f26d1c602be4aa85acd764dd72"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.216004 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerID="82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb" exitCode=0
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.217156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" event={"ID":"ab996378-985c-4fa8-bcc8-1eccde288a1e","Type":"ContainerDied","Data":"82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.223684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerStarted","Data":"4c1d21dd34b79f3b673ecb2ff7924594aa3a579f7110e79747f6c190a8b7f833"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.227530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"95acc4bd-d14c-4204-b20e-36085edffb73","Type":"ContainerStarted","Data":"24f6f7e98ab943c7ffc938882db060f7a5c265b1fcdf183d0d6348f091468444"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.230290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac","Type":"ContainerStarted","Data":"c616aa6561518f62d7511516d602dd87d23300e10e9d03e6cec6b42d7eb2ae74"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.231838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerStarted","Data":"4bc4599d1c35e74df80d583142c8049f1b02bafc4828e8e2648a2835e96dd8ad"}
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.356384 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 06 15:10:27 crc kubenswrapper[4763]: W1006 15:10:27.367377 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb444a5f8_8311_488c_b612_2d44328edc52.slice/crio-073c0c732f18861a2e2528365b60aece181bc2842d2501b76dd7f08e706ec232 WatchSource:0}: Error finding container 073c0c732f18861a2e2528365b60aece181bc2842d2501b76dd7f08e706ec232: Status 404 returned error can't find the container with id 073c0c732f18861a2e2528365b60aece181bc2842d2501b76dd7f08e706ec232
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.751005 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95"
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.755490 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t"
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.864537 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"]
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.903981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc\") pod \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") "
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.904048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx\") pod \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") "
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.904116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config\") pod \"2b087804-1c0f-432d-b3e4-44a41e4a9757\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") "
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.904164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config\") pod \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\" (UID: \"5736922b-33dd-4ac5-8c9b-d199c72b5c78\") "
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.904322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcgts\" (UniqueName: \"kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts\") pod \"2b087804-1c0f-432d-b3e4-44a41e4a9757\" (UID: \"2b087804-1c0f-432d-b3e4-44a41e4a9757\") "
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.906277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config" (OuterVolumeSpecName: "config") pod "5736922b-33dd-4ac5-8c9b-d199c72b5c78" (UID: "5736922b-33dd-4ac5-8c9b-d199c72b5c78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.906343 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5736922b-33dd-4ac5-8c9b-d199c72b5c78" (UID: "5736922b-33dd-4ac5-8c9b-d199c72b5c78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.906883 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config" (OuterVolumeSpecName: "config") pod "2b087804-1c0f-432d-b3e4-44a41e4a9757" (UID: "2b087804-1c0f-432d-b3e4-44a41e4a9757"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.910132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts" (OuterVolumeSpecName: "kube-api-access-lcgts") pod "2b087804-1c0f-432d-b3e4-44a41e4a9757" (UID: "2b087804-1c0f-432d-b3e4-44a41e4a9757"). InnerVolumeSpecName "kube-api-access-lcgts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:10:27 crc kubenswrapper[4763]: I1006 15:10:27.910251 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx" (OuterVolumeSpecName: "kube-api-access-tk4qx") pod "5736922b-33dd-4ac5-8c9b-d199c72b5c78" (UID: "5736922b-33dd-4ac5-8c9b-d199c72b5c78"). InnerVolumeSpecName "kube-api-access-tk4qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.006058 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.006110 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcgts\" (UniqueName: \"kubernetes.io/projected/2b087804-1c0f-432d-b3e4-44a41e4a9757-kube-api-access-lcgts\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.006133 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5736922b-33dd-4ac5-8c9b-d199c72b5c78-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.006147 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/5736922b-33dd-4ac5-8c9b-d199c72b5c78-kube-api-access-tk4qx\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.006158 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b087804-1c0f-432d-b3e4-44a41e4a9757-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.240835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerStarted","Data":"474c8ae0c0ae7a3e36727f0aaec390fcdb9eadd6fb0a89930415b9183f155250"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.242155 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.242188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ctt95" event={"ID":"5736922b-33dd-4ac5-8c9b-d199c72b5c78","Type":"ContainerDied","Data":"c4ed771f55ba622bdca4f3c918d4c691658d773681c152e43c9da5a0e2cdd93d"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.245596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" event={"ID":"ab996378-985c-4fa8-bcc8-1eccde288a1e","Type":"ContainerStarted","Data":"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.245761 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.249167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" event={"ID":"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31","Type":"ContainerStarted","Data":"fe138415dc947a471df08b86b03b53a815c137e140183219b500f293a4b6ab0b"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.249273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.251004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerStarted","Data":"073c0c732f18861a2e2528365b60aece181bc2842d2501b76dd7f08e706ec232"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.252643 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t" event={"ID":"2b087804-1c0f-432d-b3e4-44a41e4a9757","Type":"ContainerDied","Data":"d2a158e6e9d976f59cd3c025e9d8560f60879dae29351fad425226189a0f5ee7"}
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.252684 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2p55t"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.269470 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" podStartSLOduration=2.942191391 podStartE2EDuration="15.269453332s" podCreationTimestamp="2025-10-06 15:10:13 +0000 UTC" firstStartedPulling="2025-10-06 15:10:13.99662532 +0000 UTC m=+1011.151917832" lastFinishedPulling="2025-10-06 15:10:26.323887261 +0000 UTC m=+1023.479179773" observedRunningTime="2025-10-06 15:10:28.262702655 +0000 UTC m=+1025.417995167" watchObservedRunningTime="2025-10-06 15:10:28.269453332 +0000 UTC m=+1025.424745844"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.281077 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" podStartSLOduration=2.961428653 podStartE2EDuration="15.281060943s" podCreationTimestamp="2025-10-06 15:10:13 +0000 UTC" firstStartedPulling="2025-10-06 15:10:13.958728432 +0000 UTC m=+1011.114020944" lastFinishedPulling="2025-10-06 15:10:26.278360722 +0000 UTC m=+1023.433653234" observedRunningTime="2025-10-06 15:10:28.275954792 +0000 UTC m=+1025.431247304" watchObservedRunningTime="2025-10-06 15:10:28.281060943 +0000 UTC m=+1025.436353475"
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.319436 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"]
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.327415 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ctt95"]
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.343226 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"]
Oct 06 15:10:28 crc kubenswrapper[4763]: I1006 15:10:28.350722 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2p55t"]
Oct 06 15:10:29 crc kubenswrapper[4763]: I1006 15:10:29.583287 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b087804-1c0f-432d-b3e4-44a41e4a9757" path="/var/lib/kubelet/pods/2b087804-1c0f-432d-b3e4-44a41e4a9757/volumes"
Oct 06 15:10:29 crc kubenswrapper[4763]: I1006 15:10:29.583675 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5736922b-33dd-4ac5-8c9b-d199c72b5c78" path="/var/lib/kubelet/pods/5736922b-33dd-4ac5-8c9b-d199c72b5c78/volumes"
Oct 06 15:10:33 crc kubenswrapper[4763]: I1006 15:10:33.397867 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:33 crc kubenswrapper[4763]: I1006 15:10:33.705808 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr"
Oct 06 15:10:33 crc kubenswrapper[4763]: I1006 15:10:33.870121 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"]
Oct 06 15:10:33 crc kubenswrapper[4763]: I1006 15:10:33.878105 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:10:33 crc kubenswrapper[4763]: I1006 15:10:33.878155 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:10:34 crc kubenswrapper[4763]: I1006 15:10:34.298467 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="dnsmasq-dns" containerID="cri-o://fe138415dc947a471df08b86b03b53a815c137e140183219b500f293a4b6ab0b" gracePeriod=10
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.306460 4763 generic.go:334] "Generic (PLEG): container finished" podID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerID="fe138415dc947a471df08b86b03b53a815c137e140183219b500f293a4b6ab0b" exitCode=0
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.306588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" event={"ID":"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31","Type":"ContainerDied","Data":"fe138415dc947a471df08b86b03b53a815c137e140183219b500f293a4b6ab0b"}
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.307021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r" event={"ID":"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31","Type":"ContainerDied","Data":"8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb"}
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.307038 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dbd22d090e37630cfc3d0f9cd102643cf408f7cf533e7560ae3fd85a10075bb"
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.390155 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.550786 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp8tc\" (UniqueName: \"kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc\") pod \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") "
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.552072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc\") pod \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") "
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.552252 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config\") pod \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\" (UID: \"024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31\") "
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.578072 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc" (OuterVolumeSpecName: "kube-api-access-vp8tc") pod "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" (UID: "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31"). InnerVolumeSpecName "kube-api-access-vp8tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.654573 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp8tc\" (UniqueName: \"kubernetes.io/projected/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-kube-api-access-vp8tc\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.904159 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config" (OuterVolumeSpecName: "config") pod "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" (UID: "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.906163 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" (UID: "024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.958921 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:35 crc kubenswrapper[4763]: I1006 15:10:35.958963 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.319597 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerStarted","Data":"1d3e6fcc20427b0cfe161638b80bcdb635dc69220c8ec3c37520d536e807fc49"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.321489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac","Type":"ContainerStarted","Data":"993dfcf56ad01e08872a874c38355b60188342f405c463c9cfdb706de429021f"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.322025 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.324430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerStarted","Data":"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.326149 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs" event={"ID":"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7","Type":"ContainerStarted","Data":"3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.326572 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lw4hs"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.328260 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerStarted","Data":"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.330035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerStarted","Data":"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.331941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerStarted","Data":"5c5d1d55b187f82aaee42975506945a3d1714a7b414a61bf50e5ba10dc72666f"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.337298 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-q287r"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.341536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"95acc4bd-d14c-4204-b20e-36085edffb73","Type":"ContainerStarted","Data":"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"}
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.341589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.365085 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.974256079 podStartE2EDuration="17.365059457s" podCreationTimestamp="2025-10-06 15:10:19 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.834026678 +0000 UTC m=+1023.989319190" lastFinishedPulling="2025-10-06 15:10:35.224830036 +0000 UTC m=+1032.380122568" observedRunningTime="2025-10-06 15:10:36.361869319 +0000 UTC m=+1033.517161831" watchObservedRunningTime="2025-10-06 15:10:36.365059457 +0000 UTC m=+1033.520351969"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.406228 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.253761511 podStartE2EDuration="19.406210705s" podCreationTimestamp="2025-10-06 15:10:17 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.839004436 +0000 UTC m=+1023.994296948" lastFinishedPulling="2025-10-06 15:10:33.99145363 +0000 UTC m=+1031.146746142" observedRunningTime="2025-10-06 15:10:36.406044451 +0000 UTC m=+1033.561336963" watchObservedRunningTime="2025-10-06 15:10:36.406210705 +0000 UTC m=+1033.561503217"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.448402 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lw4hs" podStartSLOduration=5.320570015 podStartE2EDuration="13.44838272s" podCreationTimestamp="2025-10-06 15:10:23 +0000 UTC" firstStartedPulling="2025-10-06 15:10:27.017133362 +0000 UTC m=+1024.172425874" lastFinishedPulling="2025-10-06 15:10:35.144946077 +0000 UTC m=+1032.300238579" observedRunningTime="2025-10-06 15:10:36.444196255 +0000 UTC m=+1033.599488767" watchObservedRunningTime="2025-10-06 15:10:36.44838272 +0000 UTC m=+1033.603675232"
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.462508 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"]
Oct 06 15:10:36 crc kubenswrapper[4763]: I1006 15:10:36.469511 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-q287r"]
Oct 06 15:10:37 crc kubenswrapper[4763]: I1006 15:10:37.346342 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerStarted","Data":"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"}
Oct 06 15:10:37 crc kubenswrapper[4763]: I1006 15:10:37.349149 4763 generic.go:334] "Generic (PLEG): container finished" podID="d14df013-8cb0-4f11-b69d-a52002788320" containerID="1d3e6fcc20427b0cfe161638b80bcdb635dc69220c8ec3c37520d536e807fc49" exitCode=0
Oct 06 15:10:37 crc kubenswrapper[4763]: I1006 15:10:37.349235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerDied","Data":"1d3e6fcc20427b0cfe161638b80bcdb635dc69220c8ec3c37520d536e807fc49"}
Oct 06 15:10:37 crc kubenswrapper[4763]: I1006 15:10:37.351790 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerStarted","Data":"e44c7c112f2e0d0aa4ddbe5721b5449d3464205575c5a4821512c86f3f926b10"}
Oct 06 15:10:37 crc kubenswrapper[4763]: I1006 15:10:37.588725 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" path="/var/lib/kubelet/pods/024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31/volumes"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.370124 4763 generic.go:334] "Generic (PLEG): container finished" podID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerID="8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e" exitCode=0
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.370255 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerDied","Data":"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.373492 4763 generic.go:334] "Generic (PLEG): container finished" podID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerID="6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719" exitCode=0
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.373575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerDied","Data":"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.375870 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerStarted","Data":"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.384782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerStarted","Data":"a14cb1fee4bc452ad0d1ead1b1070a9175132789cf27b82cbc5f083baa390272"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.388682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerStarted","Data":"645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.388723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerStarted","Data":"0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86"}
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.389402 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cf4dn"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.389467 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cf4dn"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.438022 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.438070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.475206 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.851948416 podStartE2EDuration="16.47518689s" podCreationTimestamp="2025-10-06 15:10:23 +0000 UTC" firstStartedPulling="2025-10-06 15:10:27.12014734 +0000 UTC m=+1024.275439852" lastFinishedPulling="2025-10-06 15:10:38.743385804 +0000 UTC m=+1035.898678326" observedRunningTime="2025-10-06 15:10:39.468070254 +0000 UTC m=+1036.623362776" watchObservedRunningTime="2025-10-06 15:10:39.47518689 +0000 UTC m=+1036.630479412"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.489487 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.141490943 podStartE2EDuration="15.489469095s" podCreationTimestamp="2025-10-06 15:10:24 +0000 UTC" firstStartedPulling="2025-10-06 15:10:27.371963854 +0000 UTC m=+1024.527256356" lastFinishedPulling="2025-10-06 15:10:38.719941966 +0000 UTC m=+1035.875234508" observedRunningTime="2025-10-06 15:10:39.485718402 +0000 UTC m=+1036.641010914" watchObservedRunningTime="2025-10-06 15:10:39.489469095 +0000 UTC m=+1036.644761617"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.499158 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:10:39 crc kubenswrapper[4763]: I1006 15:10:39.535071 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cf4dn" podStartSLOduration=9.265025261 podStartE2EDuration="16.535054946s" podCreationTimestamp="2025-10-06 15:10:23 +0000 UTC" firstStartedPulling="2025-10-06 15:10:27.878057059 +0000 UTC m=+1025.033349571" lastFinishedPulling="2025-10-06 15:10:35.148086744 +0000 UTC m=+1032.303379256" observedRunningTime="2025-10-06 15:10:39.513924892 +0000 UTC m=+1036.669217414" watchObservedRunningTime="2025-10-06 15:10:39.535054946 +0000 UTC m=+1036.690347458"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.400496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerStarted","Data":"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71"}
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.403040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerStarted","Data":"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080"}
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.441747 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.501330941 podStartE2EDuration="24.441715534s" podCreationTimestamp="2025-10-06 15:10:16 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.839357686 +0000 UTC m=+1023.994650198" lastFinishedPulling="2025-10-06 15:10:34.779742269 +0000 UTC m=+1031.935034791" observedRunningTime="2025-10-06 15:10:40.431295477 +0000 UTC m=+1037.586588049" watchObservedRunningTime="2025-10-06 15:10:40.441715534 +0000 UTC m=+1037.597008126"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.450221 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.457546 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.127845072 podStartE2EDuration="26.45752479s" podCreationTimestamp="2025-10-06 15:10:14 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.81530234 +0000 UTC m=+1023.970594872" lastFinishedPulling="2025-10-06 15:10:35.144982078 +0000 UTC m=+1032.300274590" observedRunningTime="2025-10-06 15:10:40.45207299 +0000 UTC m=+1037.607365542" watchObservedRunningTime="2025-10-06 15:10:40.45752479 +0000 UTC m=+1037.612817332"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.773984 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mqpgb"]
Oct 06 15:10:40 crc kubenswrapper[4763]: E1006 15:10:40.777361 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="dnsmasq-dns"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.777390 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="dnsmasq-dns"
Oct 06 15:10:40 crc kubenswrapper[4763]: E1006 15:10:40.777436 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="init"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.777442 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="init"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.777688 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="024fe1c7-e7a0-4ae6-a8ea-63f51fcb9f31" containerName="dnsmasq-dns"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.778643 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.782786 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.794035 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mqpgb"]
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.807830 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"]
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.808995 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.811495 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.827304 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"]
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948632 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nn9q\" (UniqueName: \"kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948760 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xplq\" (UniqueName: \"kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:40 crc kubenswrapper[4763]: I1006 15:10:40.948978 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xplq\" (UniqueName: \"kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051164 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.051480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nn9q\" (UniqueName: \"kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052810 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.052841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.053215 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.053397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.053596 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.055448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.055824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.066822 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mqpgb"]
Oct 06 15:10:41 crc kubenswrapper[4763]: E1006 15:10:41.067421 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4nn9q], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb" podUID="710c2f94-b98f-4e37-8c01-8b0fe12eb76f"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.070528 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xplq\" (UniqueName: \"kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq\") pod \"ovn-controller-metrics-6cpk4\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") " pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.075668 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nn9q\" (UniqueName: \"kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q\") pod \"dnsmasq-dns-7fd796d7df-mqpgb\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") " pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.087291 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.088793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.092012 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.099972 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.128923 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.194356 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.194689 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.242946 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.257698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.257746 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rh4w\" (UniqueName: \"kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.257805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.257837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.259258 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.360934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.360997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.361040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.361148 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.361179 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rh4w\" (UniqueName: \"kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.361890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.362017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.362027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.362754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.384006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rh4w\" (UniqueName: \"kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w\") pod \"dnsmasq-dns-86db49b7ff-hxbfq\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.412052 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.431088 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.433195 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.455182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.564769 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config\") pod \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") "
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.565096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb\") pod \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") "
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.565229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nn9q\" (UniqueName: \"kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q\") pod \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") "
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.565281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc\") pod \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\" (UID: \"710c2f94-b98f-4e37-8c01-8b0fe12eb76f\") "
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.565443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config" (OuterVolumeSpecName: "config") pod "710c2f94-b98f-4e37-8c01-8b0fe12eb76f" (UID: "710c2f94-b98f-4e37-8c01-8b0fe12eb76f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.566150 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.566221 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "710c2f94-b98f-4e37-8c01-8b0fe12eb76f" (UID: "710c2f94-b98f-4e37-8c01-8b0fe12eb76f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.567979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "710c2f94-b98f-4e37-8c01-8b0fe12eb76f" (UID: "710c2f94-b98f-4e37-8c01-8b0fe12eb76f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.574890 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q" (OuterVolumeSpecName: "kube-api-access-4nn9q") pod "710c2f94-b98f-4e37-8c01-8b0fe12eb76f" (UID: "710c2f94-b98f-4e37-8c01-8b0fe12eb76f"). InnerVolumeSpecName "kube-api-access-4nn9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.595977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.603473 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.605348 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.608586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.609326 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.609357 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-fntwn"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.609525 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.616004 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.667682 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nn9q\" (UniqueName: \"kubernetes.io/projected/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-kube-api-access-4nn9q\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.667713 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.667724 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710c2f94-b98f-4e37-8c01-8b0fe12eb76f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.768840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.768883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.768914 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.769088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.769190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdsp\" (UniqueName: \"kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.769219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.769482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.870910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.870968 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.870990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.871016 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.871039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.871064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdsp\" (UniqueName: \"kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.871080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.872722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.872884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.873456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.875985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.876068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.876725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.893920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdsp\" (UniqueName: \"kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp\") pod \"ovn-northd-0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") " pod="openstack/ovn-northd-0"
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.910222 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"]
Oct 06 15:10:41 crc kubenswrapper[4763]: I1006 15:10:41.932682 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.364090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 15:10:42 crc kubenswrapper[4763]: W1006 15:10:42.378540 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7530761_b715_4178_8d58_5e1cd54838d0.slice/crio-05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8 WatchSource:0}: Error finding container 05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8: Status 404 returned error can't find the container with id 05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8 Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.421464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" event={"ID":"f08be146-c9e1-4d1a-9a0a-e3009886d765","Type":"ContainerStarted","Data":"0ff89682735479b6104f68f0c8a41299e918aee4bea6bf49d6ad8efedc0c5023"} Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.423151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerStarted","Data":"05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8"} Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.424316 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mqpgb" Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.424364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cpk4" event={"ID":"40580e5d-8c54-477e-af15-1ba2cf5d3dc0","Type":"ContainerStarted","Data":"e0589c2b199d7a1eeef3ee81d2e36a477aad249ea3279f2222f2ef53b2fecd65"} Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.471240 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mqpgb"] Oct 06 15:10:42 crc kubenswrapper[4763]: I1006 15:10:42.483234 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mqpgb"] Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.068896 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.435694 4763 generic.go:334] "Generic (PLEG): container finished" podID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerID="8a6b4a0dde30127c75d4b33ee3affd2a273cf10910c01ef6a2ab8d3ec7a313b8" exitCode=0 Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.435775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" event={"ID":"f08be146-c9e1-4d1a-9a0a-e3009886d765","Type":"ContainerDied","Data":"8a6b4a0dde30127c75d4b33ee3affd2a273cf10910c01ef6a2ab8d3ec7a313b8"} Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.438864 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cpk4" event={"ID":"40580e5d-8c54-477e-af15-1ba2cf5d3dc0","Type":"ContainerStarted","Data":"8095d8ba556be14995d51b888d8ed4a97695bfea0a9a26d8a90c3415861649c3"} Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.474501 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6cpk4" podStartSLOduration=3.474474224 podStartE2EDuration="3.474474224s" podCreationTimestamp="2025-10-06 15:10:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:10:43.469816116 +0000 UTC m=+1040.625108638" watchObservedRunningTime="2025-10-06 15:10:43.474474224 +0000 UTC m=+1040.629766786" Oct 06 15:10:43 crc kubenswrapper[4763]: I1006 15:10:43.587077 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710c2f94-b98f-4e37-8c01-8b0fe12eb76f" path="/var/lib/kubelet/pods/710c2f94-b98f-4e37-8c01-8b0fe12eb76f/volumes" Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.449737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" event={"ID":"f08be146-c9e1-4d1a-9a0a-e3009886d765","Type":"ContainerStarted","Data":"7242f319d6c1a54962bac278a1084432ab94c61f55550301719a3115be8c7455"} Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.450173 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.453896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerStarted","Data":"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950"} Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.454020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerStarted","Data":"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8"} Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.454186 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.482539 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" podStartSLOduration=3.482515374 podStartE2EDuration="3.482515374s" podCreationTimestamp="2025-10-06 15:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:10:44.472417885 +0000 UTC m=+1041.627710407" watchObservedRunningTime="2025-10-06 15:10:44.482515374 +0000 UTC m=+1041.637807886" Oct 06 15:10:44 crc kubenswrapper[4763]: I1006 15:10:44.503887 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.305796935 podStartE2EDuration="3.503864802s" podCreationTimestamp="2025-10-06 15:10:41 +0000 UTC" firstStartedPulling="2025-10-06 15:10:42.381739929 +0000 UTC m=+1039.537032481" lastFinishedPulling="2025-10-06 15:10:43.579807836 +0000 UTC m=+1040.735100348" observedRunningTime="2025-10-06 15:10:44.496968612 +0000 UTC m=+1041.652261164" watchObservedRunningTime="2025-10-06 15:10:44.503864802 +0000 UTC m=+1041.659157314" Oct 06 15:10:46 crc kubenswrapper[4763]: I1006 15:10:46.303715 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 06 15:10:46 crc kubenswrapper[4763]: I1006 15:10:46.304168 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 06 15:10:46 crc kubenswrapper[4763]: I1006 15:10:46.372972 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 15:10:46 crc kubenswrapper[4763]: I1006 15:10:46.535700 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.638044 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.638382 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.695683 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.751397 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qlkzb"] Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.753988 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.773059 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qlkzb"] Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.854428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cdl\" (UniqueName: \"kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl\") pod \"keystone-db-create-qlkzb\" (UID: \"0a8ce03a-8961-4c2d-858f-1d808f76c115\") " pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.955926 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cdl\" (UniqueName: \"kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl\") pod \"keystone-db-create-qlkzb\" (UID: \"0a8ce03a-8961-4c2d-858f-1d808f76c115\") " pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.986935 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zmbmn"] Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.988409 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.993857 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55cdl\" (UniqueName: \"kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl\") pod \"keystone-db-create-qlkzb\" (UID: \"0a8ce03a-8961-4c2d-858f-1d808f76c115\") " pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:47 crc kubenswrapper[4763]: I1006 15:10:47.998977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zmbmn"] Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.086020 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.163565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khf8\" (UniqueName: \"kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8\") pod \"placement-db-create-zmbmn\" (UID: \"0f34f668-f8f5-4575-88b8-28af7c3c97c7\") " pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.265681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khf8\" (UniqueName: \"kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8\") pod \"placement-db-create-zmbmn\" (UID: \"0f34f668-f8f5-4575-88b8-28af7c3c97c7\") " pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.287110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khf8\" (UniqueName: \"kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8\") pod \"placement-db-create-zmbmn\" (UID: \"0f34f668-f8f5-4575-88b8-28af7c3c97c7\") " pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.375799 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.544310 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.575478 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qlkzb"] Oct 06 15:10:48 crc kubenswrapper[4763]: I1006 15:10:48.830466 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zmbmn"] Oct 06 15:10:48 crc kubenswrapper[4763]: W1006 15:10:48.838881 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f34f668_f8f5_4575_88b8_28af7c3c97c7.slice/crio-89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46 WatchSource:0}: Error finding container 89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46: Status 404 returned error can't find the container with id 89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46 Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.498339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmbmn" event={"ID":"0f34f668-f8f5-4575-88b8-28af7c3c97c7","Type":"ContainerStarted","Data":"89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46"} Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.499811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qlkzb" event={"ID":"0a8ce03a-8961-4c2d-858f-1d808f76c115","Type":"ContainerStarted","Data":"6070981fc47549629b450ed7cf508eaabf45ee46c8a9e94d67de7949125cff51"} Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.658084 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"] Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.658344 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="dnsmasq-dns" 
containerID="cri-o://7242f319d6c1a54962bac278a1084432ab94c61f55550301719a3115be8c7455" gracePeriod=10 Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.665792 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.696957 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.700133 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.703935 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.716261 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.797558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.797641 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.797695 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.797721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2tdh\" (UniqueName: \"kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.797806 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.899213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.899255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.899295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.899313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2tdh\" (UniqueName: \"kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.899343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.900285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.900328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.900482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.900498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:49 crc kubenswrapper[4763]: I1006 15:10:49.937708 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2tdh\" (UniqueName: \"kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh\") pod \"dnsmasq-dns-698758b865-zj5tx\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.032051 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.500294 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:10:50 crc kubenswrapper[4763]: W1006 15:10:50.510117 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2179e949_55aa_41a1_a5de_6a9be811df18.slice/crio-ab5a2a7cc451fceddb81a79c53df7770fc0cac412f91b8c0c8e5723dd2d8d749 WatchSource:0}: Error finding container ab5a2a7cc451fceddb81a79c53df7770fc0cac412f91b8c0c8e5723dd2d8d749: Status 404 returned error can't find the container with id ab5a2a7cc451fceddb81a79c53df7770fc0cac412f91b8c0c8e5723dd2d8d749 Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.513268 4763 generic.go:334] "Generic (PLEG): container finished" podID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerID="7242f319d6c1a54962bac278a1084432ab94c61f55550301719a3115be8c7455" exitCode=0 Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.513319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" event={"ID":"f08be146-c9e1-4d1a-9a0a-e3009886d765","Type":"ContainerDied","Data":"7242f319d6c1a54962bac278a1084432ab94c61f55550301719a3115be8c7455"} Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.787519 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.794378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.800247 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rw8cj" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.800440 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.800485 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.800721 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.800979 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.917234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.917271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h449c\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.917382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: 
\"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.917397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:50 crc kubenswrapper[4763]: I1006 15:10:50.917420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.019469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.019519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.019559 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.019605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.019653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h449c\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.019696 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.019730 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.019787 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift podName:84d1d27d-b811-4100-9366-b71d6ae0f4a0 nodeName:}" failed. No retries permitted until 2025-10-06 15:10:51.519768491 +0000 UTC m=+1048.675061013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift") pod "swift-storage-0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0") : configmap "swift-ring-files" not found Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.020144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.020218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.020249 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.065119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h449c\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.069945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.431868 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.522561 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zj5tx" event={"ID":"2179e949-55aa-41a1-a5de-6a9be811df18","Type":"ContainerStarted","Data":"ab5a2a7cc451fceddb81a79c53df7770fc0cac412f91b8c0c8e5723dd2d8d749"} Oct 06 15:10:51 crc kubenswrapper[4763]: I1006 15:10:51.529485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.529716 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.529741 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 15:10:51 crc kubenswrapper[4763]: E1006 15:10:51.529796 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift podName:84d1d27d-b811-4100-9366-b71d6ae0f4a0 nodeName:}" failed. No retries permitted until 2025-10-06 15:10:52.529778267 +0000 UTC m=+1049.685070789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift") pod "swift-storage-0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0") : configmap "swift-ring-files" not found Oct 06 15:10:52 crc kubenswrapper[4763]: I1006 15:10:52.534909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qlkzb" event={"ID":"0a8ce03a-8961-4c2d-858f-1d808f76c115","Type":"ContainerStarted","Data":"d961877ec4b14e2b7e6ef98679339178907bb759679beb038faba68e04b8d83c"} Oct 06 15:10:52 crc kubenswrapper[4763]: I1006 15:10:52.536950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmbmn" event={"ID":"0f34f668-f8f5-4575-88b8-28af7c3c97c7","Type":"ContainerStarted","Data":"f9d09efa24f76ffcf8c6cf177621756e703ca8bd1f580098a21fd80b3636e2ea"} Oct 06 15:10:52 crc kubenswrapper[4763]: I1006 15:10:52.544039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:52 crc kubenswrapper[4763]: E1006 15:10:52.544215 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 15:10:52 crc kubenswrapper[4763]: E1006 15:10:52.544244 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 15:10:52 crc kubenswrapper[4763]: E1006 15:10:52.544288 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift podName:84d1d27d-b811-4100-9366-b71d6ae0f4a0 nodeName:}" failed. No retries permitted until 2025-10-06 15:10:54.544273005 +0000 UTC m=+1051.699565517 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift") pod "swift-storage-0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0") : configmap "swift-ring-files" not found Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.220461 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.232600 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2flgn"] Oct 06 15:10:53 crc kubenswrapper[4763]: E1006 15:10:53.233055 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="dnsmasq-dns" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.233079 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="dnsmasq-dns" Oct 06 15:10:53 crc kubenswrapper[4763]: E1006 15:10:53.233104 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="init" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.233113 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="init" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.233326 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" containerName="dnsmasq-dns" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.234048 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2flgn" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.244440 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2flgn"] Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.360490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config\") pod \"f08be146-c9e1-4d1a-9a0a-e3009886d765\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.360532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rh4w\" (UniqueName: \"kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w\") pod \"f08be146-c9e1-4d1a-9a0a-e3009886d765\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.360558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb\") pod \"f08be146-c9e1-4d1a-9a0a-e3009886d765\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.360682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc\") pod \"f08be146-c9e1-4d1a-9a0a-e3009886d765\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.360723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb\") pod \"f08be146-c9e1-4d1a-9a0a-e3009886d765\" (UID: \"f08be146-c9e1-4d1a-9a0a-e3009886d765\") " Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.361045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67sq\" (UniqueName: 
\"kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq\") pod \"glance-db-create-2flgn\" (UID: \"d9d61678-af2a-45c3-bf0e-3c5244a9390a\") " pod="openstack/glance-db-create-2flgn" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.365655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w" (OuterVolumeSpecName: "kube-api-access-6rh4w") pod "f08be146-c9e1-4d1a-9a0a-e3009886d765" (UID: "f08be146-c9e1-4d1a-9a0a-e3009886d765"). InnerVolumeSpecName "kube-api-access-6rh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.408665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f08be146-c9e1-4d1a-9a0a-e3009886d765" (UID: "f08be146-c9e1-4d1a-9a0a-e3009886d765"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.413161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f08be146-c9e1-4d1a-9a0a-e3009886d765" (UID: "f08be146-c9e1-4d1a-9a0a-e3009886d765"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.424555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f08be146-c9e1-4d1a-9a0a-e3009886d765" (UID: "f08be146-c9e1-4d1a-9a0a-e3009886d765"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.425085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config" (OuterVolumeSpecName: "config") pod "f08be146-c9e1-4d1a-9a0a-e3009886d765" (UID: "f08be146-c9e1-4d1a-9a0a-e3009886d765"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.462995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67sq\" (UniqueName: \"kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq\") pod \"glance-db-create-2flgn\" (UID: \"d9d61678-af2a-45c3-bf0e-3c5244a9390a\") " pod="openstack/glance-db-create-2flgn" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.463415 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.463486 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rh4w\" (UniqueName: \"kubernetes.io/projected/f08be146-c9e1-4d1a-9a0a-e3009886d765-kube-api-access-6rh4w\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.463507 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.463526 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.463606 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f08be146-c9e1-4d1a-9a0a-e3009886d765-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.484507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67sq\" (UniqueName: \"kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq\") pod \"glance-db-create-2flgn\" (UID: \"d9d61678-af2a-45c3-bf0e-3c5244a9390a\") " pod="openstack/glance-db-create-2flgn" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.548754 4763 generic.go:334] "Generic (PLEG): container finished" podID="0f34f668-f8f5-4575-88b8-28af7c3c97c7" containerID="f9d09efa24f76ffcf8c6cf177621756e703ca8bd1f580098a21fd80b3636e2ea" exitCode=0 Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.548861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmbmn" event={"ID":"0f34f668-f8f5-4575-88b8-28af7c3c97c7","Type":"ContainerDied","Data":"f9d09efa24f76ffcf8c6cf177621756e703ca8bd1f580098a21fd80b3636e2ea"} Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.552355 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2flgn" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.552396 4763 generic.go:334] "Generic (PLEG): container finished" podID="2179e949-55aa-41a1-a5de-6a9be811df18" containerID="ac77de8718aa9799ad9e2971d6d8f7f5f4b6beba58865e521d8e158e329848a0" exitCode=0 Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.552430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zj5tx" event={"ID":"2179e949-55aa-41a1-a5de-6a9be811df18","Type":"ContainerDied","Data":"ac77de8718aa9799ad9e2971d6d8f7f5f4b6beba58865e521d8e158e329848a0"} Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.556877 4763 generic.go:334] "Generic (PLEG): container finished" podID="0a8ce03a-8961-4c2d-858f-1d808f76c115" containerID="d961877ec4b14e2b7e6ef98679339178907bb759679beb038faba68e04b8d83c" exitCode=0 Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.556993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qlkzb" event={"ID":"0a8ce03a-8961-4c2d-858f-1d808f76c115","Type":"ContainerDied","Data":"d961877ec4b14e2b7e6ef98679339178907bb759679beb038faba68e04b8d83c"} Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.560820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" event={"ID":"f08be146-c9e1-4d1a-9a0a-e3009886d765","Type":"ContainerDied","Data":"0ff89682735479b6104f68f0c8a41299e918aee4bea6bf49d6ad8efedc0c5023"} Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.560883 4763 scope.go:117] "RemoveContainer" containerID="7242f319d6c1a54962bac278a1084432ab94c61f55550301719a3115be8c7455" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.561061 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hxbfq" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.611932 4763 scope.go:117] "RemoveContainer" containerID="8a6b4a0dde30127c75d4b33ee3affd2a273cf10910c01ef6a2ab8d3ec7a313b8" Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.665568 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"] Oct 06 15:10:53 crc kubenswrapper[4763]: I1006 15:10:53.672855 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hxbfq"] Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.022712 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2flgn"] Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.588968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zj5tx" event={"ID":"2179e949-55aa-41a1-a5de-6a9be811df18","Type":"ContainerStarted","Data":"52d31a65e70cf7edf041177a17133c179009219d839b7894b441830e400538c1"} Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.589513 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.593881 4763 generic.go:334] "Generic (PLEG): container finished" podID="d9d61678-af2a-45c3-bf0e-3c5244a9390a" containerID="183e3372d23d8635d4405d3ea33596bff482fb5dc71ccbaf10e36cd781aa4b28" exitCode=0 Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.594087 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2flgn" event={"ID":"d9d61678-af2a-45c3-bf0e-3c5244a9390a","Type":"ContainerDied","Data":"183e3372d23d8635d4405d3ea33596bff482fb5dc71ccbaf10e36cd781aa4b28"} Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.594110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2flgn" event={"ID":"d9d61678-af2a-45c3-bf0e-3c5244a9390a","Type":"ContainerStarted","Data":"b1dafea37e82eb3ca3f9ffdc9f6318856b12fccd6a90526abf9f28f1341756ae"} Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.610488 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-zj5tx" podStartSLOduration=5.610473466 podStartE2EDuration="5.610473466s" podCreationTimestamp="2025-10-06 15:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:10:54.610317302 +0000 UTC m=+1051.765609814" watchObservedRunningTime="2025-10-06 15:10:54.610473466 +0000 UTC m=+1051.765765978" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.636796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:54 crc kubenswrapper[4763]: E1006 15:10:54.637027 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 15:10:54 crc kubenswrapper[4763]: E1006 15:10:54.637042 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 15:10:54 crc kubenswrapper[4763]: E1006 15:10:54.637083 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift podName:84d1d27d-b811-4100-9366-b71d6ae0f4a0 nodeName:}" failed. No retries permitted until 2025-10-06 15:10:58.637069679 +0000 UTC m=+1055.792362191 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift") pod "swift-storage-0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0") : configmap "swift-ring-files" not found Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.851917 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-r2m7t"] Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.852929 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.854636 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.854990 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.858067 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.866428 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-r2m7t"] Oct 06 15:10:54 crc kubenswrapper[4763]: I1006 15:10:54.936925 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6khf8\" (UniqueName: \"kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8\") pod \"0f34f668-f8f5-4575-88b8-28af7c3c97c7\" (UID: \"0f34f668-f8f5-4575-88b8-28af7c3c97c7\") " Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042703 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zd9s\" (UniqueName: \"kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042817 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.042915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.047636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8" (OuterVolumeSpecName: "kube-api-access-6khf8") pod "0f34f668-f8f5-4575-88b8-28af7c3c97c7" (UID: "0f34f668-f8f5-4575-88b8-28af7c3c97c7"). InnerVolumeSpecName "kube-api-access-6khf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.100217 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.143890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144004 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zd9s\" (UniqueName: \"kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144151 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144275 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6khf8\" (UniqueName: \"kubernetes.io/projected/0f34f668-f8f5-4575-88b8-28af7c3c97c7-kube-api-access-6khf8\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.144850 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.145046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.147604 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.150853 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.151532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.161050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zd9s\" (UniqueName: \"kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s\") pod \"swift-ring-rebalance-r2m7t\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.234777 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.245818 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55cdl\" (UniqueName: \"kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl\") pod \"0a8ce03a-8961-4c2d-858f-1d808f76c115\" (UID: \"0a8ce03a-8961-4c2d-858f-1d808f76c115\") " Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.249966 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl" (OuterVolumeSpecName: "kube-api-access-55cdl") pod "0a8ce03a-8961-4c2d-858f-1d808f76c115" (UID: "0a8ce03a-8961-4c2d-858f-1d808f76c115"). InnerVolumeSpecName "kube-api-access-55cdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.349549 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55cdl\" (UniqueName: \"kubernetes.io/projected/0a8ce03a-8961-4c2d-858f-1d808f76c115-kube-api-access-55cdl\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.584377 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08be146-c9e1-4d1a-9a0a-e3009886d765" path="/var/lib/kubelet/pods/f08be146-c9e1-4d1a-9a0a-e3009886d765/volumes" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.602407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmbmn" event={"ID":"0f34f668-f8f5-4575-88b8-28af7c3c97c7","Type":"ContainerDied","Data":"89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46"} Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.602449 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d1742668b3924b0344dec24dd65617f613d83678eafcff99ce2a64cb526e46" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.602515 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmbmn" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.603663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qlkzb" event={"ID":"0a8ce03a-8961-4c2d-858f-1d808f76c115","Type":"ContainerDied","Data":"6070981fc47549629b450ed7cf508eaabf45ee46c8a9e94d67de7949125cff51"} Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.603690 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6070981fc47549629b450ed7cf508eaabf45ee46c8a9e94d67de7949125cff51" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.603701 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qlkzb" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.687746 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-r2m7t"] Oct 06 15:10:55 crc kubenswrapper[4763]: W1006 15:10:55.691742 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad00dc53_32d8_4edd_ab1c_e9467d8be9eb.slice/crio-3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c WatchSource:0}: Error finding container 3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c: Status 404 returned error can't find the container with id 3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.879327 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2flgn" Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.964071 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67sq\" (UniqueName: \"kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq\") pod \"d9d61678-af2a-45c3-bf0e-3c5244a9390a\" (UID: \"d9d61678-af2a-45c3-bf0e-3c5244a9390a\") " Oct 06 15:10:55 crc kubenswrapper[4763]: I1006 15:10:55.969906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq" (OuterVolumeSpecName: "kube-api-access-f67sq") pod "d9d61678-af2a-45c3-bf0e-3c5244a9390a" (UID: "d9d61678-af2a-45c3-bf0e-3c5244a9390a"). InnerVolumeSpecName "kube-api-access-f67sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.066339 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67sq\" (UniqueName: \"kubernetes.io/projected/d9d61678-af2a-45c3-bf0e-3c5244a9390a-kube-api-access-f67sq\") on node \"crc\" DevicePath \"\"" Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.615654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-r2m7t" event={"ID":"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb","Type":"ContainerStarted","Data":"3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c"} Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.617717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2flgn" event={"ID":"d9d61678-af2a-45c3-bf0e-3c5244a9390a","Type":"ContainerDied","Data":"b1dafea37e82eb3ca3f9ffdc9f6318856b12fccd6a90526abf9f28f1341756ae"} Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.617739 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1dafea37e82eb3ca3f9ffdc9f6318856b12fccd6a90526abf9f28f1341756ae" Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.617996 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2flgn" Oct 06 15:10:56 crc kubenswrapper[4763]: I1006 15:10:56.989877 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 15:10:58 crc kubenswrapper[4763]: I1006 15:10:58.710191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:10:58 crc kubenswrapper[4763]: E1006 15:10:58.710920 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 15:10:58 crc kubenswrapper[4763]: E1006 15:10:58.711088 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 15:10:58 crc kubenswrapper[4763]: E1006 15:10:58.711133 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift podName:84d1d27d-b811-4100-9366-b71d6ae0f4a0 nodeName:}" failed. No retries permitted until 2025-10-06 15:11:06.711118955 +0000 UTC m=+1063.866411467 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift") pod "swift-storage-0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0") : configmap "swift-ring-files" not found Oct 06 15:10:59 crc kubenswrapper[4763]: I1006 15:10:59.663996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-r2m7t" event={"ID":"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb","Type":"ContainerStarted","Data":"39fd94c7915cbb68b0ee4017312fa1eba7294d2ea98ae1e2255811da30e8afb0"} Oct 06 15:10:59 crc kubenswrapper[4763]: I1006 15:10:59.691544 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-r2m7t" podStartSLOduration=2.179871457 podStartE2EDuration="5.691521764s" podCreationTimestamp="2025-10-06 15:10:54 +0000 UTC" firstStartedPulling="2025-10-06 15:10:55.694766988 +0000 UTC m=+1052.850059520" lastFinishedPulling="2025-10-06 15:10:59.206417315 +0000 UTC m=+1056.361709827" observedRunningTime="2025-10-06 15:10:59.684886151 +0000 UTC m=+1056.840178663" watchObservedRunningTime="2025-10-06 15:10:59.691521764 +0000 UTC m=+1056.846814276" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.033890 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.114710 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"] Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.114980 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="dnsmasq-dns" containerID="cri-o://3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e" gracePeriod=10 Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.547904 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.646284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc\") pod \"ab996378-985c-4fa8-bcc8-1eccde288a1e\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.646385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config\") pod \"ab996378-985c-4fa8-bcc8-1eccde288a1e\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.646412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvlxx\" (UniqueName: \"kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx\") pod \"ab996378-985c-4fa8-bcc8-1eccde288a1e\" (UID: \"ab996378-985c-4fa8-bcc8-1eccde288a1e\") " Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.651816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx" (OuterVolumeSpecName: "kube-api-access-dvlxx") pod "ab996378-985c-4fa8-bcc8-1eccde288a1e" (UID: "ab996378-985c-4fa8-bcc8-1eccde288a1e"). InnerVolumeSpecName "kube-api-access-dvlxx". 
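[Editor's note] The startup record for swift-ring-rebalance-r2m7t above shows how the latency tracker discounts image pulls: the end-to-end duration (observedRunningTime minus podCreationTimestamp) is 5.69s, the pull window (lastFinishedPulling minus firstStartedPulling) is about 3.51s, and podStartSLOduration is their difference, roughly 2.18s. A small sketch reproducing that arithmetic from the logged wall-clock timestamps:

    // slo.go - recomputes the swift-ring-rebalance-r2m7t startup numbers from the log.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-10-06 15:10:54 +0000 UTC")              // podCreationTimestamp
        firstPull := parse("2025-10-06 15:10:55.694766988 +0000 UTC") // firstStartedPulling
        lastPull := parse("2025-10-06 15:10:59.206417315 +0000 UTC")  // lastFinishedPulling
        running := parse("2025-10-06 15:10:59.691521764 +0000 UTC")   // observedRunningTime

        e2e := running.Sub(created)     // 5.691521764s = podStartE2EDuration
        pull := lastPull.Sub(firstPull) // 3.511650327s spent pulling images
        slo := e2e - pull               // 2.179871437s, vs logged 2.179871457
        fmt.Println(e2e, pull, slo)
    }

The last few tens of nanoseconds of difference from the logged podStartSLOduration come from the kubelet subtracting its monotonic readings (the m=+ offsets) rather than the wall-clock strings.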
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.674572 4763 generic.go:334] "Generic (PLEG): container finished" podID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerID="3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e" exitCode=0 Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.674713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" event={"ID":"ab996378-985c-4fa8-bcc8-1eccde288a1e","Type":"ContainerDied","Data":"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e"} Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.674778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" event={"ID":"ab996378-985c-4fa8-bcc8-1eccde288a1e","Type":"ContainerDied","Data":"9881b380b5b3d2fedf559ca6d737f14f0ab871aabb8275ad3bba15c3fc2b3b7c"} Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.674801 4763 scope.go:117] "RemoveContainer" containerID="3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.674900 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t8zrr" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.692050 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config" (OuterVolumeSpecName: "config") pod "ab996378-985c-4fa8-bcc8-1eccde288a1e" (UID: "ab996378-985c-4fa8-bcc8-1eccde288a1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.695301 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab996378-985c-4fa8-bcc8-1eccde288a1e" (UID: "ab996378-985c-4fa8-bcc8-1eccde288a1e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.731763 4763 scope.go:117] "RemoveContainer" containerID="82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.748376 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.748409 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab996378-985c-4fa8-bcc8-1eccde288a1e-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.748421 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvlxx\" (UniqueName: \"kubernetes.io/projected/ab996378-985c-4fa8-bcc8-1eccde288a1e-kube-api-access-dvlxx\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.753296 4763 scope.go:117] "RemoveContainer" containerID="3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e" Oct 06 15:11:00 crc kubenswrapper[4763]: E1006 15:11:00.753708 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e\": container with ID starting with 3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e not found: ID does not exist" containerID="3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.753744 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e"} err="failed to get container status \"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e\": rpc error: code = NotFound desc = could not find container \"3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e\": container with ID starting with 3fedd6bc8d7af236d6cdf9d301f4774c9dd4aa38b845e80773f648e3038e281e not found: ID does not exist" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.753769 4763 scope.go:117] "RemoveContainer" containerID="82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb" Oct 06 15:11:00 crc kubenswrapper[4763]: E1006 15:11:00.753994 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb\": container with ID starting with 82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb not found: ID does not exist" containerID="82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb" Oct 06 15:11:00 crc kubenswrapper[4763]: I1006 15:11:00.754027 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb"} err="failed to get container status \"82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb\": rpc error: code = NotFound desc = could not find container \"82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb\": container with ID starting with 82e85ff652018253c23d7164b620137d6cadc62586006e9c3617516ebe5d28bb not found: ID does not exist" Oct 06 15:11:01 crc kubenswrapper[4763]: I1006 
Oct 06 15:11:01 crc kubenswrapper[4763]: I1006 15:11:01.008281 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"]
Oct 06 15:11:01 crc kubenswrapper[4763]: I1006 15:11:01.013601 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t8zrr"]
Oct 06 15:11:01 crc kubenswrapper[4763]: I1006 15:11:01.592238 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" path="/var/lib/kubelet/pods/ab996378-985c-4fa8-bcc8-1eccde288a1e/volumes"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.382534 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bac8-account-create-x9mm7"]
Oct 06 15:11:03 crc kubenswrapper[4763]: E1006 15:11:03.383455 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8ce03a-8961-4c2d-858f-1d808f76c115" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383471 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8ce03a-8961-4c2d-858f-1d808f76c115" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: E1006 15:11:03.383483 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f34f668-f8f5-4575-88b8-28af7c3c97c7" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383490 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f34f668-f8f5-4575-88b8-28af7c3c97c7" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: E1006 15:11:03.383509 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="dnsmasq-dns"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383517 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="dnsmasq-dns"
Oct 06 15:11:03 crc kubenswrapper[4763]: E1006 15:11:03.383540 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="init"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383547 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="init"
Oct 06 15:11:03 crc kubenswrapper[4763]: E1006 15:11:03.383570 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d61678-af2a-45c3-bf0e-3c5244a9390a" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383578 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d61678-af2a-45c3-bf0e-3c5244a9390a" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383817 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f34f668-f8f5-4575-88b8-28af7c3c97c7" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383834 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8ce03a-8961-4c2d-858f-1d808f76c115" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383847 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d61678-af2a-45c3-bf0e-3c5244a9390a" containerName="mariadb-database-create"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.383860 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab996378-985c-4fa8-bcc8-1eccde288a1e" containerName="dnsmasq-dns"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.384661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.390186 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.395383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76cqn\" (UniqueName: \"kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn\") pod \"glance-bac8-account-create-x9mm7\" (UID: \"97d34e81-6b03-42b5-a903-9f42e5618133\") " pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.396343 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bac8-account-create-x9mm7"]
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.496933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76cqn\" (UniqueName: \"kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn\") pod \"glance-bac8-account-create-x9mm7\" (UID: \"97d34e81-6b03-42b5-a903-9f42e5618133\") " pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.526188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76cqn\" (UniqueName: \"kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn\") pod \"glance-bac8-account-create-x9mm7\" (UID: \"97d34e81-6b03-42b5-a903-9f42e5618133\") " pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.717853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.877326 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.877391 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.877450 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.878593 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 15:11:03 crc kubenswrapper[4763]: I1006 15:11:03.878753 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea" gracePeriod=600
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.148218 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bac8-account-create-x9mm7"]
Oct 06 15:11:04 crc kubenswrapper[4763]: W1006 15:11:04.153704 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97d34e81_6b03_42b5_a903_9f42e5618133.slice/crio-1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a WatchSource:0}: Error finding container 1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a: Status 404 returned error can't find the container with id 1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.715558 4763 generic.go:334] "Generic (PLEG): container finished" podID="97d34e81-6b03-42b5-a903-9f42e5618133" containerID="7e445100d7b145aaaf73332a81543e81ff0be4b2cbeb11b7c92527fe9a85e535" exitCode=0
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.715816 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bac8-account-create-x9mm7" event={"ID":"97d34e81-6b03-42b5-a903-9f42e5618133","Type":"ContainerDied","Data":"7e445100d7b145aaaf73332a81543e81ff0be4b2cbeb11b7c92527fe9a85e535"}
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.715841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bac8-account-create-x9mm7" event={"ID":"97d34e81-6b03-42b5-a903-9f42e5618133","Type":"ContainerStarted","Data":"1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a"}
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.718662 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea" exitCode=0
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.718726 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea"}
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.718762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a"}
Oct 06 15:11:04 crc kubenswrapper[4763]: I1006 15:11:04.718786 4763 scope.go:117] "RemoveContainer" containerID="0b167bea82fcc2f3729a095299a58826cc2314cb35b4e8eb0ed7c680899b999c"
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.119527 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bac8-account-create-x9mm7"
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.242998 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76cqn\" (UniqueName: \"kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn\") pod \"97d34e81-6b03-42b5-a903-9f42e5618133\" (UID: \"97d34e81-6b03-42b5-a903-9f42e5618133\") "
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.248122 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn" (OuterVolumeSpecName: "kube-api-access-76cqn") pod "97d34e81-6b03-42b5-a903-9f42e5618133" (UID: "97d34e81-6b03-42b5-a903-9f42e5618133"). InnerVolumeSpecName "kube-api-access-76cqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.344597 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76cqn\" (UniqueName: \"kubernetes.io/projected/97d34e81-6b03-42b5-a903-9f42e5618133-kube-api-access-76cqn\") on node \"crc\" DevicePath \"\""
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.742406 4763 generic.go:334] "Generic (PLEG): container finished" podID="ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" containerID="39fd94c7915cbb68b0ee4017312fa1eba7294d2ea98ae1e2255811da30e8afb0" exitCode=0
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.742495 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-r2m7t" event={"ID":"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb","Type":"ContainerDied","Data":"39fd94c7915cbb68b0ee4017312fa1eba7294d2ea98ae1e2255811da30e8afb0"}
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.746997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bac8-account-create-x9mm7" event={"ID":"97d34e81-6b03-42b5-a903-9f42e5618133","Type":"ContainerDied","Data":"1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a"}
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.747029 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1252a5065d9fe2efef697f46e3dab3f402f0522ac3e34a2b3ff23b0e833ef30a"
Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.747136 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bac8-account-create-x9mm7"
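[Editor's note] The machine-config-daemon restart above is the standard liveness-kill path: the prober's GET to http://127.0.0.1:8798/health is refused, the kubelet marks the probe unhealthy, kills container 6fbccdc9... with the pod's grace period (600s here), and PLEG then reports the replacement 2cc326b9... started. A rough sketch of the HTTP check the prober performs, with the URL taken from the log and the timeout assumed:

    // liveness.go - illustrative HTTP liveness check; timeout is an assumption.
    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe fails on any transport error (including "connection refused")
    // or on a status outside the 2xx/3xx range, like the kubelet's HTTP prober.
    func probe(url string) error {
        client := &http.Client{Timeout: 1 * time.Second} // assumed probe timeout
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health"); err != nil {
            // On repeated failures the kubelet kills and restarts the container.
            fmt.Println("Probe failed:", err)
        }
    }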
Need to start a new one" pod="openstack/glance-bac8-account-create-x9mm7" Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.751956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:11:06 crc kubenswrapper[4763]: I1006 15:11:06.771402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"swift-storage-0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " pod="openstack/swift-storage-0" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.025842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.537338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:11:07 crc kubenswrapper[4763]: W1006 15:11:07.554661 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d1d27d_b811_4100_9366_b71d6ae0f4a0.slice/crio-e97644fc94ac473bf028f6ed61b44913b5e8b70072987fa1c96eac4af0a82575 WatchSource:0}: Error finding container e97644fc94ac473bf028f6ed61b44913b5e8b70072987fa1c96eac4af0a82575: Status 404 returned error can't find the container with id e97644fc94ac473bf028f6ed61b44913b5e8b70072987fa1c96eac4af0a82575 Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.757915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"e97644fc94ac473bf028f6ed61b44913b5e8b70072987fa1c96eac4af0a82575"} Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.803252 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-de27-account-create-jl2kv"] Oct 06 15:11:07 crc kubenswrapper[4763]: E1006 15:11:07.803678 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d34e81-6b03-42b5-a903-9f42e5618133" containerName="mariadb-account-create" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.803704 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d34e81-6b03-42b5-a903-9f42e5618133" containerName="mariadb-account-create" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.803865 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d34e81-6b03-42b5-a903-9f42e5618133" containerName="mariadb-account-create" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.804395 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.807375 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.809787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-de27-account-create-jl2kv"] Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.871406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8wd\" (UniqueName: \"kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd\") pod \"keystone-de27-account-create-jl2kv\" (UID: \"286e2cc3-89f5-4c6b-bf84-044f889676f8\") " pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.974016 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8wd\" (UniqueName: \"kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd\") pod \"keystone-de27-account-create-jl2kv\" (UID: \"286e2cc3-89f5-4c6b-bf84-044f889676f8\") " pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:07 crc kubenswrapper[4763]: I1006 15:11:07.999800 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8wd\" (UniqueName: \"kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd\") pod \"keystone-de27-account-create-jl2kv\" (UID: \"286e2cc3-89f5-4c6b-bf84-044f889676f8\") " pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.072843 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077100 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zd9s\" (UniqueName: \"kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077210 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077344 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077463 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.077583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift\") pod \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\" (UID: \"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb\") " Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.079466 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.080513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.082792 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s" (OuterVolumeSpecName: "kube-api-access-8zd9s") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "kube-api-access-8zd9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.113226 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.115882 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.121653 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-19ad-account-create-x69kg"] Oct 06 15:11:08 crc kubenswrapper[4763]: E1006 15:11:08.122114 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" containerName="swift-ring-rebalance" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.122131 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" containerName="swift-ring-rebalance" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.122338 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" containerName="swift-ring-rebalance" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.122915 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.124897 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.130029 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-19ad-account-create-x69kg"] Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.143101 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.153172 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.157252 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts" (OuterVolumeSpecName: "scripts") pod "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" (UID: "ad00dc53-32d8-4edd-ab1c-e9467d8be9eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sp9\" (UniqueName: \"kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9\") pod \"placement-19ad-account-create-x69kg\" (UID: \"d34374ef-f967-4050-bca8-ec81d585ee6e\") " pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179391 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179406 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179418 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zd9s\" (UniqueName: \"kubernetes.io/projected/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-kube-api-access-8zd9s\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179429 4763 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179439 4763 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179448 4763 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.179460 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.280553 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sp9\" (UniqueName: \"kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9\") pod \"placement-19ad-account-create-x69kg\" (UID: \"d34374ef-f967-4050-bca8-ec81d585ee6e\") " pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.302592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sp9\" (UniqueName: \"kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9\") pod \"placement-19ad-account-create-x69kg\" (UID: \"d34374ef-f967-4050-bca8-ec81d585ee6e\") " 
pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.513357 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fpb82"] Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.514383 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.517087 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-m6ktk" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.517191 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.522807 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fpb82"] Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.564943 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.578930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-de27-account-create-jl2kv"] Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.588183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmqkg\" (UniqueName: \"kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.588248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.588286 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.588353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: W1006 15:11:08.646453 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod286e2cc3_89f5_4c6b_bf84_044f889676f8.slice/crio-d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7 WatchSource:0}: Error finding container d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7: Status 404 returned error can't find the container with id d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7 Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.689234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.689325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.689424 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmqkg\" (UniqueName: \"kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.689455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.693192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.696961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.699047 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.716863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmqkg\" (UniqueName: \"kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg\") pod \"glance-db-sync-fpb82\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.770671 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-r2m7t" event={"ID":"ad00dc53-32d8-4edd-ab1c-e9467d8be9eb","Type":"ContainerDied","Data":"3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c"} Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.770930 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-r2m7t" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.770937 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3880a8c313687ca03f6f0968ff127f20a9333d77f4b26b733950fde207fcf14c" Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.772056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-de27-account-create-jl2kv" event={"ID":"286e2cc3-89f5-4c6b-bf84-044f889676f8","Type":"ContainerStarted","Data":"d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7"} Oct 06 15:11:08 crc kubenswrapper[4763]: I1006 15:11:08.843283 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.078376 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-19ad-account-create-x69kg"] Oct 06 15:11:09 crc kubenswrapper[4763]: W1006 15:11:09.137515 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34374ef_f967_4050_bca8_ec81d585ee6e.slice/crio-3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b WatchSource:0}: Error finding container 3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b: Status 404 returned error can't find the container with id 3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.217883 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lw4hs" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" probeResult="failure" output=< Oct 06 15:11:09 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 15:11:09 crc kubenswrapper[4763]: > Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.244191 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.264223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.374719 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fpb82"] Oct 06 15:11:09 crc kubenswrapper[4763]: W1006 15:11:09.389346 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3209a935_b3c7_4cfd_961b_1a7550aa1f63.slice/crio-bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d WatchSource:0}: Error finding container bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d: Status 404 returned error can't find the container with id bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.486464 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lw4hs-config-m6qrg"] Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.487562 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.489930 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.497414 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lw4hs-config-m6qrg"] Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505367 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmqd\" (UniqueName: \"kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.505543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn\") pod 
\"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607300 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmqd\" (UniqueName: \"kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607345 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.607656 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.608181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.610677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts\") pod 
\"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.629274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmqd\" (UniqueName: \"kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd\") pod \"ovn-controller-lw4hs-config-m6qrg\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.779899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fpb82" event={"ID":"3209a935-b3c7-4cfd-961b-1a7550aa1f63","Type":"ContainerStarted","Data":"bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.781539 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerID="e44c7c112f2e0d0aa4ddbe5721b5449d3464205575c5a4821512c86f3f926b10" exitCode=0 Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.781587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerDied","Data":"e44c7c112f2e0d0aa4ddbe5721b5449d3464205575c5a4821512c86f3f926b10"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.792494 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.792544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.792558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.792570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.795751 4763 generic.go:334] "Generic (PLEG): container finished" podID="d34374ef-f967-4050-bca8-ec81d585ee6e" containerID="1e4d1491bc56ac192201b7ad02f9fb5fc81651e1d4b719423ea1e29ac06c157c" exitCode=0 Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.795814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-19ad-account-create-x69kg" event={"ID":"d34374ef-f967-4050-bca8-ec81d585ee6e","Type":"ContainerDied","Data":"1e4d1491bc56ac192201b7ad02f9fb5fc81651e1d4b719423ea1e29ac06c157c"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.795847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-19ad-account-create-x69kg" event={"ID":"d34374ef-f967-4050-bca8-ec81d585ee6e","Type":"ContainerStarted","Data":"3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b"} Oct 06 15:11:09 crc kubenswrapper[4763]: 
I1006 15:11:09.798035 4763 generic.go:334] "Generic (PLEG): container finished" podID="286e2cc3-89f5-4c6b-bf84-044f889676f8" containerID="05bf3caedcaa68cbf017d60a64fb60ae8fb7231b8d15fab4b0d6608e04dd0284" exitCode=0 Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.798085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-de27-account-create-jl2kv" event={"ID":"286e2cc3-89f5-4c6b-bf84-044f889676f8","Type":"ContainerDied","Data":"05bf3caedcaa68cbf017d60a64fb60ae8fb7231b8d15fab4b0d6608e04dd0284"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.800775 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerID="afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76" exitCode=0 Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.801037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerDied","Data":"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"} Oct 06 15:11:09 crc kubenswrapper[4763]: I1006 15:11:09.834520 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.193951 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lw4hs-config-m6qrg"] Oct 06 15:11:10 crc kubenswrapper[4763]: W1006 15:11:10.207845 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3795d77d_5050_4383_bfab_e66593691b73.slice/crio-e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4 WatchSource:0}: Error finding container e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4: Status 404 returned error can't find the container with id e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4 Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.813580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerStarted","Data":"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"} Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.814066 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.818153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerStarted","Data":"9dd278deff6663bb1f284e72573e37f876ea35fa390fa6b2c8c631abd30f4c75"} Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.818431 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.820783 4763 generic.go:334] "Generic (PLEG): container finished" podID="3795d77d-5050-4383-bfab-e66593691b73" containerID="e2dc15d06d499d02fb313b5e2fbbff8faaf5bdca3ae7c4532ae7f40c987017e1" exitCode=0 Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.820999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs-config-m6qrg" event={"ID":"3795d77d-5050-4383-bfab-e66593691b73","Type":"ContainerDied","Data":"e2dc15d06d499d02fb313b5e2fbbff8faaf5bdca3ae7c4532ae7f40c987017e1"} Oct 06 15:11:10 crc 
kubenswrapper[4763]: I1006 15:11:10.821046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs-config-m6qrg" event={"ID":"3795d77d-5050-4383-bfab-e66593691b73","Type":"ContainerStarted","Data":"e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4"} Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.838285 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.626799685 podStartE2EDuration="57.838265924s" podCreationTimestamp="2025-10-06 15:10:13 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.565560444 +0000 UTC m=+1023.720852956" lastFinishedPulling="2025-10-06 15:10:34.777026683 +0000 UTC m=+1031.932319195" observedRunningTime="2025-10-06 15:11:10.8315753 +0000 UTC m=+1067.986867812" watchObservedRunningTime="2025-10-06 15:11:10.838265924 +0000 UTC m=+1067.993558436" Oct 06 15:11:10 crc kubenswrapper[4763]: I1006 15:11:10.873134 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.122230963 podStartE2EDuration="57.873114375s" podCreationTimestamp="2025-10-06 15:10:13 +0000 UTC" firstStartedPulling="2025-10-06 15:10:26.84529913 +0000 UTC m=+1024.000591642" lastFinishedPulling="2025-10-06 15:10:34.596182542 +0000 UTC m=+1031.751475054" observedRunningTime="2025-10-06 15:11:10.868884588 +0000 UTC m=+1068.024177100" watchObservedRunningTime="2025-10-06 15:11:10.873114375 +0000 UTC m=+1068.028406887" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.254669 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.262848 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.343193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8wd\" (UniqueName: \"kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd\") pod \"286e2cc3-89f5-4c6b-bf84-044f889676f8\" (UID: \"286e2cc3-89f5-4c6b-bf84-044f889676f8\") " Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.343254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sp9\" (UniqueName: \"kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9\") pod \"d34374ef-f967-4050-bca8-ec81d585ee6e\" (UID: \"d34374ef-f967-4050-bca8-ec81d585ee6e\") " Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.350380 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd" (OuterVolumeSpecName: "kube-api-access-sq8wd") pod "286e2cc3-89f5-4c6b-bf84-044f889676f8" (UID: "286e2cc3-89f5-4c6b-bf84-044f889676f8"). InnerVolumeSpecName "kube-api-access-sq8wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.352755 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9" (OuterVolumeSpecName: "kube-api-access-p8sp9") pod "d34374ef-f967-4050-bca8-ec81d585ee6e" (UID: "d34374ef-f967-4050-bca8-ec81d585ee6e"). InnerVolumeSpecName "kube-api-access-p8sp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.445332 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq8wd\" (UniqueName: \"kubernetes.io/projected/286e2cc3-89f5-4c6b-bf84-044f889676f8-kube-api-access-sq8wd\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.445360 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8sp9\" (UniqueName: \"kubernetes.io/projected/d34374ef-f967-4050-bca8-ec81d585ee6e-kube-api-access-p8sp9\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.832995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.833921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.833961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.833974 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.835682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-19ad-account-create-x69kg" event={"ID":"d34374ef-f967-4050-bca8-ec81d585ee6e","Type":"ContainerDied","Data":"3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.835717 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5308817081afc92cea3e3a6faa4b8b9eae1ea4475733dd80d484b3c9dec35b" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.835776 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-19ad-account-create-x69kg" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.838481 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-de27-account-create-jl2kv" Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.838514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-de27-account-create-jl2kv" event={"ID":"286e2cc3-89f5-4c6b-bf84-044f889676f8","Type":"ContainerDied","Data":"d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7"} Oct 06 15:11:11 crc kubenswrapper[4763]: I1006 15:11:11.838541 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d399073c01ff1356fe28e48d776b851803bcf7d186c306508395e7710f2724a7" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.096578 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155418 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155471 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155522 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155546 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrmqd\" (UniqueName: \"kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155605 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155692 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run" (OuterVolumeSpecName: "var-run") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run\") pod \"3795d77d-5050-4383-bfab-e66593691b73\" (UID: \"3795d77d-5050-4383-bfab-e66593691b73\") " Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.155714 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156575 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156599 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156675 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156690 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3795d77d-5050-4383-bfab-e66593691b73-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.156988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts" (OuterVolumeSpecName: "scripts") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.160149 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd" (OuterVolumeSpecName: "kube-api-access-rrmqd") pod "3795d77d-5050-4383-bfab-e66593691b73" (UID: "3795d77d-5050-4383-bfab-e66593691b73"). InnerVolumeSpecName "kube-api-access-rrmqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.258752 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3795d77d-5050-4383-bfab-e66593691b73-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.258793 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrmqd\" (UniqueName: \"kubernetes.io/projected/3795d77d-5050-4383-bfab-e66593691b73-kube-api-access-rrmqd\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.864041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038"} Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.866411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs-config-m6qrg" event={"ID":"3795d77d-5050-4383-bfab-e66593691b73","Type":"ContainerDied","Data":"e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4"} Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.866461 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6956068a76f0750e4e01a789f84ca917dbd6f79bf325638fbea35c91578d3e4" Oct 06 15:11:12 crc kubenswrapper[4763]: I1006 15:11:12.866506 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lw4hs-config-m6qrg" Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.216728 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lw4hs-config-m6qrg"] Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.221743 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lw4hs-config-m6qrg"] Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.598482 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3795d77d-5050-4383-bfab-e66593691b73" path="/var/lib/kubelet/pods/3795d77d-5050-4383-bfab-e66593691b73/volumes" Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.893372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f"} Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.893414 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c"} Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.893423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83"} Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.893432 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191"} Oct 06 15:11:13 crc kubenswrapper[4763]: I1006 15:11:13.893491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9"} Oct 06 15:11:14 crc kubenswrapper[4763]: I1006 15:11:14.214224 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lw4hs" Oct 06 15:11:14 crc kubenswrapper[4763]: I1006 15:11:14.908563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerStarted","Data":"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1"} Oct 06 15:11:14 crc kubenswrapper[4763]: I1006 15:11:14.946827 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.838641367 podStartE2EDuration="25.946809191s" podCreationTimestamp="2025-10-06 15:10:49 +0000 UTC" firstStartedPulling="2025-10-06 15:11:07.557024188 +0000 UTC m=+1064.712316700" lastFinishedPulling="2025-10-06 15:11:12.665192012 +0000 UTC m=+1069.820484524" observedRunningTime="2025-10-06 15:11:14.941122344 +0000 UTC m=+1072.096414856" watchObservedRunningTime="2025-10-06 15:11:14.946809191 +0000 UTC m=+1072.102101723" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212252 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:15 crc kubenswrapper[4763]: E1006 15:11:15.212629 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286e2cc3-89f5-4c6b-bf84-044f889676f8" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212649 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="286e2cc3-89f5-4c6b-bf84-044f889676f8" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: E1006 15:11:15.212685 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3795d77d-5050-4383-bfab-e66593691b73" containerName="ovn-config" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212694 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3795d77d-5050-4383-bfab-e66593691b73" containerName="ovn-config" Oct 06 15:11:15 crc kubenswrapper[4763]: E1006 15:11:15.212716 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34374ef-f967-4050-bca8-ec81d585ee6e" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212725 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34374ef-f967-4050-bca8-ec81d585ee6e" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212937 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34374ef-f967-4050-bca8-ec81d585ee6e" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212980 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3795d77d-5050-4383-bfab-e66593691b73" containerName="ovn-config" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.212998 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="286e2cc3-89f5-4c6b-bf84-044f889676f8" containerName="mariadb-account-create" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.215680 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.228717 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.229990 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.306711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.306787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbns\" (UniqueName: \"kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.307013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.307145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.307300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.307330 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408458 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: 
\"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbns\" (UniqueName: \"kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.408779 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.409511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.414420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.414420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.414554 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.414573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: 
I1006 15:11:15.429563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbns\" (UniqueName: \"kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns\") pod \"dnsmasq-dns-77585f5f8c-sj596\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:15 crc kubenswrapper[4763]: I1006 15:11:15.538376 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.224840 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.556815 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.807845 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.997056 4763 generic.go:334] "Generic (PLEG): container finished" podID="35990813-4948-4085-b754-58e29f1f89c3" containerID="c66b97056318e210f006c862a966562e2ccd023d40741fa72a11c33d161518e6" exitCode=0 Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.997141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" event={"ID":"35990813-4948-4085-b754-58e29f1f89c3","Type":"ContainerDied","Data":"c66b97056318e210f006c862a966562e2ccd023d40741fa72a11c33d161518e6"} Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.997168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" event={"ID":"35990813-4948-4085-b754-58e29f1f89c3","Type":"ContainerStarted","Data":"6d0868553ed14d3458d7fc6d9c7bc15b5ccbc06978a725af0f97809a8767e272"} Oct 06 15:11:24 crc kubenswrapper[4763]: I1006 15:11:24.998983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fpb82" event={"ID":"3209a935-b3c7-4cfd-961b-1a7550aa1f63","Type":"ContainerStarted","Data":"fd0ecba7a3ee2a477705590263aa7e4b257d48c0ca869f26bf98cf7458fbfbb0"} Oct 06 15:11:25 crc kubenswrapper[4763]: I1006 15:11:25.038508 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fpb82" podStartSLOduration=2.6142548310000002 podStartE2EDuration="17.038486073s" podCreationTimestamp="2025-10-06 15:11:08 +0000 UTC" firstStartedPulling="2025-10-06 15:11:09.391453242 +0000 UTC m=+1066.546745754" lastFinishedPulling="2025-10-06 15:11:23.815684494 +0000 UTC m=+1080.970976996" observedRunningTime="2025-10-06 15:11:25.035706977 +0000 UTC m=+1082.190999499" watchObservedRunningTime="2025-10-06 15:11:25.038486073 +0000 UTC m=+1082.193778585" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.008716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" event={"ID":"35990813-4948-4085-b754-58e29f1f89c3","Type":"ContainerStarted","Data":"f71af9f4b732e58384f766e8873c8757dc3c2ead7944e432f655225af6e9404f"} Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.009025 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.042256 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" 
podStartSLOduration=11.042238966 podStartE2EDuration="11.042238966s" podCreationTimestamp="2025-10-06 15:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:26.03222316 +0000 UTC m=+1083.187515682" watchObservedRunningTime="2025-10-06 15:11:26.042238966 +0000 UTC m=+1083.197531478" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.280402 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gngw9"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.281547 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.289705 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gngw9"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.377723 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qrnms"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.378713 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.394228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwn5\" (UniqueName: \"kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5\") pod \"cinder-db-create-gngw9\" (UID: \"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a\") " pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.398949 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qrnms"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.495555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwn5\" (UniqueName: \"kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5\") pod \"cinder-db-create-gngw9\" (UID: \"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a\") " pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.495676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcrm\" (UniqueName: \"kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm\") pod \"barbican-db-create-qrnms\" (UID: \"1561ea6e-3c44-48a1-a019-eb349a46c150\") " pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.524692 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwn5\" (UniqueName: \"kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5\") pod \"cinder-db-create-gngw9\" (UID: \"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a\") " pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.543963 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2hzxx"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.544931 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.548179 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.548383 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtw6" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.548643 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.548788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.564694 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2hzxx"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.597027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcrm\" (UniqueName: \"kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm\") pod \"barbican-db-create-qrnms\" (UID: \"1561ea6e-3c44-48a1-a019-eb349a46c150\") " pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.598430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.598583 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qnhf\" (UniqueName: \"kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.598781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.614661 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcrm\" (UniqueName: \"kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm\") pod \"barbican-db-create-qrnms\" (UID: \"1561ea6e-3c44-48a1-a019-eb349a46c150\") " pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.619602 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.682063 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fkvpx"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.686282 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.693922 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fkvpx"] Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.697790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.701053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.701098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhf\" (UniqueName: \"kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.701398 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.706890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.728290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.730920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qnhf\" (UniqueName: \"kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf\") pod \"keystone-db-sync-2hzxx\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.802904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq7p6\" (UniqueName: \"kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6\") pod \"neutron-db-create-fkvpx\" (UID: \"25243974-29ff-4bcb-854c-a72b5901b0bf\") " pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.872386 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.904014 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq7p6\" (UniqueName: \"kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6\") pod \"neutron-db-create-fkvpx\" (UID: \"25243974-29ff-4bcb-854c-a72b5901b0bf\") " pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:26 crc kubenswrapper[4763]: I1006 15:11:26.929425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq7p6\" (UniqueName: \"kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6\") pod \"neutron-db-create-fkvpx\" (UID: \"25243974-29ff-4bcb-854c-a72b5901b0bf\") " pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:27 crc kubenswrapper[4763]: I1006 15:11:27.057556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qrnms"] Oct 06 15:11:27 crc kubenswrapper[4763]: I1006 15:11:27.106991 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gngw9"] Oct 06 15:11:27 crc kubenswrapper[4763]: I1006 15:11:27.107270 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:27 crc kubenswrapper[4763]: I1006 15:11:27.322024 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fkvpx"] Oct 06 15:11:27 crc kubenswrapper[4763]: W1006 15:11:27.328680 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25243974_29ff_4bcb_854c_a72b5901b0bf.slice/crio-c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8 WatchSource:0}: Error finding container c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8: Status 404 returned error can't find the container with id c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8 Oct 06 15:11:27 crc kubenswrapper[4763]: I1006 15:11:27.339016 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2hzxx"] Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.022693 4763 generic.go:334] "Generic (PLEG): container finished" podID="25243974-29ff-4bcb-854c-a72b5901b0bf" containerID="c3caa7ea290d288d2300056350ea320341cdbe1c53cd822fd83f4992c1443266" exitCode=0 Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.022758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fkvpx" event={"ID":"25243974-29ff-4bcb-854c-a72b5901b0bf","Type":"ContainerDied","Data":"c3caa7ea290d288d2300056350ea320341cdbe1c53cd822fd83f4992c1443266"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.023064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fkvpx" event={"ID":"25243974-29ff-4bcb-854c-a72b5901b0bf","Type":"ContainerStarted","Data":"c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.025145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2hzxx" event={"ID":"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c","Type":"ContainerStarted","Data":"bbe4e3af7cd3ab9da21f3a19fd58e42f3c105b7b1fffc6d8aa687f5f0f20de3b"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.026924 4763 generic.go:334] "Generic (PLEG): container finished" podID="51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" 
containerID="b3e95b1adef60fbddee41bd6936cab65de9301c6fd3e6e266eded3450d295996" exitCode=0 Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.026965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gngw9" event={"ID":"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a","Type":"ContainerDied","Data":"b3e95b1adef60fbddee41bd6936cab65de9301c6fd3e6e266eded3450d295996"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.026980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gngw9" event={"ID":"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a","Type":"ContainerStarted","Data":"0ba956b2e076381932fd3ef90d5388aa78a0d3036a513a937a049fb9ee655335"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.027929 4763 generic.go:334] "Generic (PLEG): container finished" podID="1561ea6e-3c44-48a1-a019-eb349a46c150" containerID="6e894010adcea5c116cbd4e38f6bdd3832ca519307d5338247f59df695912ae0" exitCode=0 Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.027952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qrnms" event={"ID":"1561ea6e-3c44-48a1-a019-eb349a46c150","Type":"ContainerDied","Data":"6e894010adcea5c116cbd4e38f6bdd3832ca519307d5338247f59df695912ae0"} Oct 06 15:11:28 crc kubenswrapper[4763]: I1006 15:11:28.027964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qrnms" event={"ID":"1561ea6e-3c44-48a1-a019-eb349a46c150","Type":"ContainerStarted","Data":"52ab9b8919e3081d5f8f199045b9ec2350478bad77c40269c89b9fdaba81be5f"} Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.459289 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.468243 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.489538 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.558747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq7p6\" (UniqueName: \"kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6\") pod \"25243974-29ff-4bcb-854c-a72b5901b0bf\" (UID: \"25243974-29ff-4bcb-854c-a72b5901b0bf\") " Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.558863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdwn5\" (UniqueName: \"kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5\") pod \"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a\" (UID: \"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a\") " Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.558916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkcrm\" (UniqueName: \"kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm\") pod \"1561ea6e-3c44-48a1-a019-eb349a46c150\" (UID: \"1561ea6e-3c44-48a1-a019-eb349a46c150\") " Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.569077 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6" (OuterVolumeSpecName: "kube-api-access-gq7p6") pod "25243974-29ff-4bcb-854c-a72b5901b0bf" (UID: "25243974-29ff-4bcb-854c-a72b5901b0bf"). InnerVolumeSpecName "kube-api-access-gq7p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.569963 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm" (OuterVolumeSpecName: "kube-api-access-mkcrm") pod "1561ea6e-3c44-48a1-a019-eb349a46c150" (UID: "1561ea6e-3c44-48a1-a019-eb349a46c150"). InnerVolumeSpecName "kube-api-access-mkcrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.590519 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5" (OuterVolumeSpecName: "kube-api-access-jdwn5") pod "51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" (UID: "51d2f5e0-56bc-4feb-8ebb-8347184ebe3a"). InnerVolumeSpecName "kube-api-access-jdwn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.660758 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq7p6\" (UniqueName: \"kubernetes.io/projected/25243974-29ff-4bcb-854c-a72b5901b0bf-kube-api-access-gq7p6\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.661059 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdwn5\" (UniqueName: \"kubernetes.io/projected/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a-kube-api-access-jdwn5\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:29 crc kubenswrapper[4763]: I1006 15:11:29.661070 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkcrm\" (UniqueName: \"kubernetes.io/projected/1561ea6e-3c44-48a1-a019-eb349a46c150-kube-api-access-mkcrm\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.048274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qrnms" event={"ID":"1561ea6e-3c44-48a1-a019-eb349a46c150","Type":"ContainerDied","Data":"52ab9b8919e3081d5f8f199045b9ec2350478bad77c40269c89b9fdaba81be5f"} Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.048314 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ab9b8919e3081d5f8f199045b9ec2350478bad77c40269c89b9fdaba81be5f" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.048368 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qrnms" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.052650 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fkvpx" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.053216 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fkvpx" event={"ID":"25243974-29ff-4bcb-854c-a72b5901b0bf","Type":"ContainerDied","Data":"c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8"} Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.053249 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74dba013e7f4655308ebb904c7db2df63d772d4a772292b337fdc4a16900ad8" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.057414 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gngw9" event={"ID":"51d2f5e0-56bc-4feb-8ebb-8347184ebe3a","Type":"ContainerDied","Data":"0ba956b2e076381932fd3ef90d5388aa78a0d3036a513a937a049fb9ee655335"} Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.057437 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba956b2e076381932fd3ef90d5388aa78a0d3036a513a937a049fb9ee655335" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.057473 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gngw9" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.543417 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.591157 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:11:30 crc kubenswrapper[4763]: I1006 15:11:30.591392 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-zj5tx" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="dnsmasq-dns" containerID="cri-o://52d31a65e70cf7edf041177a17133c179009219d839b7894b441830e400538c1" gracePeriod=10 Oct 06 15:11:31 crc kubenswrapper[4763]: I1006 15:11:31.070513 4763 generic.go:334] "Generic (PLEG): container finished" podID="3209a935-b3c7-4cfd-961b-1a7550aa1f63" containerID="fd0ecba7a3ee2a477705590263aa7e4b257d48c0ca869f26bf98cf7458fbfbb0" exitCode=0 Oct 06 15:11:31 crc kubenswrapper[4763]: I1006 15:11:31.070627 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fpb82" event={"ID":"3209a935-b3c7-4cfd-961b-1a7550aa1f63","Type":"ContainerDied","Data":"fd0ecba7a3ee2a477705590263aa7e4b257d48c0ca869f26bf98cf7458fbfbb0"} Oct 06 15:11:31 crc kubenswrapper[4763]: I1006 15:11:31.074689 4763 generic.go:334] "Generic (PLEG): container finished" podID="2179e949-55aa-41a1-a5de-6a9be811df18" containerID="52d31a65e70cf7edf041177a17133c179009219d839b7894b441830e400538c1" exitCode=0 Oct 06 15:11:31 crc kubenswrapper[4763]: I1006 15:11:31.074727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zj5tx" event={"ID":"2179e949-55aa-41a1-a5de-6a9be811df18","Type":"ContainerDied","Data":"52d31a65e70cf7edf041177a17133c179009219d839b7894b441830e400538c1"} Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.847012 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.926127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data\") pod \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.926240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmqkg\" (UniqueName: \"kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg\") pod \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.926288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle\") pod \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.926395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data\") pod \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\" (UID: \"3209a935-b3c7-4cfd-961b-1a7550aa1f63\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.930346 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg" (OuterVolumeSpecName: "kube-api-access-lmqkg") pod "3209a935-b3c7-4cfd-961b-1a7550aa1f63" (UID: "3209a935-b3c7-4cfd-961b-1a7550aa1f63"). InnerVolumeSpecName "kube-api-access-lmqkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.931055 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3209a935-b3c7-4cfd-961b-1a7550aa1f63" (UID: "3209a935-b3c7-4cfd-961b-1a7550aa1f63"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.935100 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:32.953739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3209a935-b3c7-4cfd-961b-1a7550aa1f63" (UID: "3209a935-b3c7-4cfd-961b-1a7550aa1f63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.001830 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data" (OuterVolumeSpecName: "config-data") pod "3209a935-b3c7-4cfd-961b-1a7550aa1f63" (UID: "3209a935-b3c7-4cfd-961b-1a7550aa1f63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.028672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2tdh\" (UniqueName: \"kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh\") pod \"2179e949-55aa-41a1-a5de-6a9be811df18\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.029268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb\") pod \"2179e949-55aa-41a1-a5de-6a9be811df18\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.029309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config\") pod \"2179e949-55aa-41a1-a5de-6a9be811df18\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.033419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb\") pod \"2179e949-55aa-41a1-a5de-6a9be811df18\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.033515 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc\") pod \"2179e949-55aa-41a1-a5de-6a9be811df18\" (UID: \"2179e949-55aa-41a1-a5de-6a9be811df18\") " Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.034880 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh" (OuterVolumeSpecName: "kube-api-access-f2tdh") pod "2179e949-55aa-41a1-a5de-6a9be811df18" (UID: "2179e949-55aa-41a1-a5de-6a9be811df18"). InnerVolumeSpecName "kube-api-access-f2tdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.035212 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmqkg\" (UniqueName: \"kubernetes.io/projected/3209a935-b3c7-4cfd-961b-1a7550aa1f63-kube-api-access-lmqkg\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.035235 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.035247 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2tdh\" (UniqueName: \"kubernetes.io/projected/2179e949-55aa-41a1-a5de-6a9be811df18-kube-api-access-f2tdh\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.035258 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.035269 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3209a935-b3c7-4cfd-961b-1a7550aa1f63-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.078596 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2179e949-55aa-41a1-a5de-6a9be811df18" (UID: "2179e949-55aa-41a1-a5de-6a9be811df18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.079515 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2179e949-55aa-41a1-a5de-6a9be811df18" (UID: "2179e949-55aa-41a1-a5de-6a9be811df18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.080038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2179e949-55aa-41a1-a5de-6a9be811df18" (UID: "2179e949-55aa-41a1-a5de-6a9be811df18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.087691 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config" (OuterVolumeSpecName: "config") pod "2179e949-55aa-41a1-a5de-6a9be811df18" (UID: "2179e949-55aa-41a1-a5de-6a9be811df18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.092229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fpb82" event={"ID":"3209a935-b3c7-4cfd-961b-1a7550aa1f63","Type":"ContainerDied","Data":"bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d"} Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.092253 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fpb82" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.092292 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb3056184b7211178e943f234ec9d52be997fede82335112f011715af6fdf3d" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.093775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2hzxx" event={"ID":"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c","Type":"ContainerStarted","Data":"afd96ca75e1ed3b393dc75249667fe38b163d255761c91560b0325c02a34a298"} Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.095805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zj5tx" event={"ID":"2179e949-55aa-41a1-a5de-6a9be811df18","Type":"ContainerDied","Data":"ab5a2a7cc451fceddb81a79c53df7770fc0cac412f91b8c0c8e5723dd2d8d749"} Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.095829 4763 scope.go:117] "RemoveContainer" containerID="52d31a65e70cf7edf041177a17133c179009219d839b7894b441830e400538c1" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.095925 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zj5tx" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.119309 4763 scope.go:117] "RemoveContainer" containerID="ac77de8718aa9799ad9e2971d6d8f7f5f4b6beba58865e521d8e158e329848a0" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.133031 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2hzxx" podStartSLOduration=1.8227089520000002 podStartE2EDuration="7.133010558s" podCreationTimestamp="2025-10-06 15:11:26 +0000 UTC" firstStartedPulling="2025-10-06 15:11:27.358929621 +0000 UTC m=+1084.514222133" lastFinishedPulling="2025-10-06 15:11:32.669231227 +0000 UTC m=+1089.824523739" observedRunningTime="2025-10-06 15:11:33.109819479 +0000 UTC m=+1090.265111991" watchObservedRunningTime="2025-10-06 15:11:33.133010558 +0000 UTC m=+1090.288303080" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.136296 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.136318 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.136327 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.136336 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2179e949-55aa-41a1-a5de-6a9be811df18-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.142673 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.150860 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zj5tx"] Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.436837 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439036 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3209a935-b3c7-4cfd-961b-1a7550aa1f63" containerName="glance-db-sync" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439054 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3209a935-b3c7-4cfd-961b-1a7550aa1f63" containerName="glance-db-sync" Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439063 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="dnsmasq-dns" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439069 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="dnsmasq-dns" Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439086 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439092 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439104 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1561ea6e-3c44-48a1-a019-eb349a46c150" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439109 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1561ea6e-3c44-48a1-a019-eb349a46c150" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439128 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="init" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439134 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="init" Oct 06 15:11:33 crc kubenswrapper[4763]: E1006 15:11:33.439145 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25243974-29ff-4bcb-854c-a72b5901b0bf" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439151 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="25243974-29ff-4bcb-854c-a72b5901b0bf" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439292 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3209a935-b3c7-4cfd-961b-1a7550aa1f63" containerName="glance-db-sync" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439308 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439315 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2179e949-55aa-41a1-a5de-6a9be811df18" containerName="dnsmasq-dns" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439326 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="25243974-29ff-4bcb-854c-a72b5901b0bf" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.439341 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1561ea6e-3c44-48a1-a019-eb349a46c150" containerName="mariadb-database-create" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.440183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.468866 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548487 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c84\" (UniqueName: \"kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548553 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548654 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548756 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.548938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.586466 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2179e949-55aa-41a1-a5de-6a9be811df18" path="/var/lib/kubelet/pods/2179e949-55aa-41a1-a5de-6a9be811df18/volumes" Oct 06 15:11:33 
crc kubenswrapper[4763]: I1006 15:11:33.650084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.650142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.650172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.650198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.650252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.650304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c84\" (UniqueName: \"kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.651052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.651761 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.651893 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.651936 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.652044 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.666910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c84\" (UniqueName: \"kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84\") pod \"dnsmasq-dns-7ff5475cc9-6cjbf\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:33 crc kubenswrapper[4763]: I1006 15:11:33.781431 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:34 crc kubenswrapper[4763]: I1006 15:11:34.224666 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 15:11:35 crc kubenswrapper[4763]: I1006 15:11:35.116749 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerID="e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412" exitCode=0 Oct 06 15:11:35 crc kubenswrapper[4763]: I1006 15:11:35.116852 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" event={"ID":"4f5d7b9c-4e17-4717-95a8-b8b871205a3a","Type":"ContainerDied","Data":"e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412"} Oct 06 15:11:35 crc kubenswrapper[4763]: I1006 15:11:35.117173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" event={"ID":"4f5d7b9c-4e17-4717-95a8-b8b871205a3a","Type":"ContainerStarted","Data":"b603f3040b76265a8b03a6c5ba5a9db3d7dad94882dd1356f111f62110f48273"} Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.144220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" event={"ID":"4f5d7b9c-4e17-4717-95a8-b8b871205a3a","Type":"ContainerStarted","Data":"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d"} Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.144925 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.146816 4763 generic.go:334] "Generic (PLEG): container finished" podID="fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" containerID="afd96ca75e1ed3b393dc75249667fe38b163d255761c91560b0325c02a34a298" exitCode=0 Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.146861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2hzxx" event={"ID":"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c","Type":"ContainerDied","Data":"afd96ca75e1ed3b393dc75249667fe38b163d255761c91560b0325c02a34a298"} Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.176448 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" podStartSLOduration=3.1764241699999998 
podStartE2EDuration="3.17642417s" podCreationTimestamp="2025-10-06 15:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:36.16696801 +0000 UTC m=+1093.322260572" watchObservedRunningTime="2025-10-06 15:11:36.17642417 +0000 UTC m=+1093.331716722" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.413070 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4bac-account-create-bk65b"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.414298 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.417146 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.423234 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4bac-account-create-bk65b"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.495459 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xp2\" (UniqueName: \"kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2\") pod \"barbican-4bac-account-create-bk65b\" (UID: \"257597c0-e6c4-40b8-9eaa-590d9caeb136\") " pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.515386 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1178-account-create-nk49j"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.516836 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.519048 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.528187 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1178-account-create-nk49j"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.597590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzft5\" (UniqueName: \"kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5\") pod \"cinder-1178-account-create-nk49j\" (UID: \"d036ff61-b8cb-4cb5-a353-206c1306f39b\") " pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.597683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xp2\" (UniqueName: \"kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2\") pod \"barbican-4bac-account-create-bk65b\" (UID: \"257597c0-e6c4-40b8-9eaa-590d9caeb136\") " pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.616923 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xp2\" (UniqueName: \"kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2\") pod \"barbican-4bac-account-create-bk65b\" (UID: \"257597c0-e6c4-40b8-9eaa-590d9caeb136\") " pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.699353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gzft5\" (UniqueName: \"kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5\") pod \"cinder-1178-account-create-nk49j\" (UID: \"d036ff61-b8cb-4cb5-a353-206c1306f39b\") " pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.703769 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-11f1-account-create-xdj7v"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.704832 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.707203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.717798 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-11f1-account-create-xdj7v"] Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.729580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzft5\" (UniqueName: \"kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5\") pod \"cinder-1178-account-create-nk49j\" (UID: \"d036ff61-b8cb-4cb5-a353-206c1306f39b\") " pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.736431 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.800784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtgrb\" (UniqueName: \"kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb\") pod \"neutron-11f1-account-create-xdj7v\" (UID: \"e3c9db67-1ae9-4eeb-8023-558bd1136960\") " pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.832850 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.902005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtgrb\" (UniqueName: \"kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb\") pod \"neutron-11f1-account-create-xdj7v\" (UID: \"e3c9db67-1ae9-4eeb-8023-558bd1136960\") " pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:36 crc kubenswrapper[4763]: I1006 15:11:36.918367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtgrb\" (UniqueName: \"kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb\") pod \"neutron-11f1-account-create-xdj7v\" (UID: \"e3c9db67-1ae9-4eeb-8023-558bd1136960\") " pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.041200 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.243218 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4bac-account-create-bk65b"] Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.311892 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1178-account-create-nk49j"] Oct 06 15:11:37 crc kubenswrapper[4763]: W1006 15:11:37.316852 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd036ff61_b8cb_4cb5_a353_206c1306f39b.slice/crio-7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e WatchSource:0}: Error finding container 7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e: Status 404 returned error can't find the container with id 7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.431071 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.486661 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-11f1-account-create-xdj7v"] Oct 06 15:11:37 crc kubenswrapper[4763]: W1006 15:11:37.487372 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c9db67_1ae9_4eeb_8023_558bd1136960.slice/crio-57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52 WatchSource:0}: Error finding container 57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52: Status 404 returned error can't find the container with id 57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52 Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.510836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qnhf\" (UniqueName: \"kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf\") pod \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.510954 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data\") pod \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.510992 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle\") pod \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\" (UID: \"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c\") " Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.516948 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf" (OuterVolumeSpecName: "kube-api-access-9qnhf") pod "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" (UID: "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c"). InnerVolumeSpecName "kube-api-access-9qnhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.538889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" (UID: "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.577013 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data" (OuterVolumeSpecName: "config-data") pod "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" (UID: "fb2d7b1b-15e9-4aba-89ed-46b37e6a769c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.613301 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qnhf\" (UniqueName: \"kubernetes.io/projected/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-kube-api-access-9qnhf\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.613336 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:37 crc kubenswrapper[4763]: I1006 15:11:37.613345 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.167559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-11f1-account-create-xdj7v" event={"ID":"e3c9db67-1ae9-4eeb-8023-558bd1136960","Type":"ContainerStarted","Data":"380638c831a780a71f75f79db475f78294719dd4f058d1f572dc7d482dfdeec2"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.167989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-11f1-account-create-xdj7v" event={"ID":"e3c9db67-1ae9-4eeb-8023-558bd1136960","Type":"ContainerStarted","Data":"57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.170054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4bac-account-create-bk65b" event={"ID":"257597c0-e6c4-40b8-9eaa-590d9caeb136","Type":"ContainerStarted","Data":"5da2eee9cbed0720c597485e546d3ebbefe89e4dd872ff9efcb3761f1de5329d"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.170195 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4bac-account-create-bk65b" event={"ID":"257597c0-e6c4-40b8-9eaa-590d9caeb136","Type":"ContainerStarted","Data":"a73bc797884635726ed20d2f8d9d94d8915fa6008b485ab1eb7880a73b1e0684"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.172693 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1178-account-create-nk49j" event={"ID":"d036ff61-b8cb-4cb5-a353-206c1306f39b","Type":"ContainerStarted","Data":"f59e250ba8dce3b88a299b3d9f53c567b19f7f179bd84af3fffb86a318ad5368"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.172769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1178-account-create-nk49j" 
event={"ID":"d036ff61-b8cb-4cb5-a353-206c1306f39b","Type":"ContainerStarted","Data":"7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.175822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2hzxx" event={"ID":"fb2d7b1b-15e9-4aba-89ed-46b37e6a769c","Type":"ContainerDied","Data":"bbe4e3af7cd3ab9da21f3a19fd58e42f3c105b7b1fffc6d8aa687f5f0f20de3b"} Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.176055 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe4e3af7cd3ab9da21f3a19fd58e42f3c105b7b1fffc6d8aa687f5f0f20de3b" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.176005 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2hzxx" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.207990 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-11f1-account-create-xdj7v" podStartSLOduration=2.207964887 podStartE2EDuration="2.207964887s" podCreationTimestamp="2025-10-06 15:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:38.197730835 +0000 UTC m=+1095.353023377" watchObservedRunningTime="2025-10-06 15:11:38.207964887 +0000 UTC m=+1095.363257419" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.238525 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-4bac-account-create-bk65b" podStartSLOduration=2.238497859 podStartE2EDuration="2.238497859s" podCreationTimestamp="2025-10-06 15:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:38.232566405 +0000 UTC m=+1095.387858957" watchObservedRunningTime="2025-10-06 15:11:38.238497859 +0000 UTC m=+1095.393790391" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.259737 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1178-account-create-nk49j" podStartSLOduration=2.259714943 podStartE2EDuration="2.259714943s" podCreationTimestamp="2025-10-06 15:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:38.255789195 +0000 UTC m=+1095.411081737" watchObservedRunningTime="2025-10-06 15:11:38.259714943 +0000 UTC m=+1095.415007465" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.452759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.453273 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="dnsmasq-dns" containerID="cri-o://beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d" gracePeriod=10 Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.475100 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:38 crc kubenswrapper[4763]: E1006 15:11:38.475507 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" containerName="keystone-db-sync" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.475522 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" containerName="keystone-db-sync" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.475772 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" containerName="keystone-db-sync" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.491578 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.525891 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xqjsv"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.527767 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535514 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.535818 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7qb\" (UniqueName: \"kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.538817 4763 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.538915 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.539150 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtw6" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.541432 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.556762 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqjsv"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.637868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7qb\" (UniqueName: \"kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.637916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.637980 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638000 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638073 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638124 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrch\" (UniqueName: \"kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638273 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.638291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.640880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.642248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.642693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.643012 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.643374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.680716 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7qb\" (UniqueName: \"kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fnxf\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.689740 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.691810 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.695845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.700751 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740766 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740820 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740838 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgh7b\" (UniqueName: \"kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740859 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.740972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.741048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.741071 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrch\" (UniqueName: \"kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.744858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.748788 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.755393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.757550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.759677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.760166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.778397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrch\" (UniqueName: \"kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch\") pod \"keystone-bootstrap-xqjsv\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.825440 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.826040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.839254 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.840574 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848806 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgh7b\" (UniqueName: \"kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848908 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.848964 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.853994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.854207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.856834 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " 
pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.856902 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.858260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.861757 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ssv6p"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.863287 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.866634 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngc52" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.866784 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.866907 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.874403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgh7b\" (UniqueName: \"kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.881294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " pod="openstack/ceilometer-0" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.881362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.919795 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ssv6p"] Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.934523 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.957884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.957925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgxkm\" (UniqueName: \"kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.957971 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25l8w\" (UniqueName: \"kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.957996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958018 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958037 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958083 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958128 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958146 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:38 crc kubenswrapper[4763]: I1006 15:11:38.958200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059472 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgxkm\" (UniqueName: \"kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25l8w\" (UniqueName: \"kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059591 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" 
(UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.059767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.060920 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.060980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.061160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.061778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.062193 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.062484 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.066352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.068777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.069597 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.070905 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.082272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgxkm\" (UniqueName: \"kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm\") pod \"placement-db-sync-ssv6p\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.122288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25l8w\" (UniqueName: \"kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w\") pod \"dnsmasq-dns-8b5c85b87-dbwqg\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.134644 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170007 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170091 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170147 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7c84\" (UniqueName: \"kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.170244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config\") pod \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\" (UID: \"4f5d7b9c-4e17-4717-95a8-b8b871205a3a\") " Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.176757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84" (OuterVolumeSpecName: "kube-api-access-n7c84") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "kube-api-access-n7c84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.197582 4763 generic.go:334] "Generic (PLEG): container finished" podID="257597c0-e6c4-40b8-9eaa-590d9caeb136" containerID="5da2eee9cbed0720c597485e546d3ebbefe89e4dd872ff9efcb3761f1de5329d" exitCode=0 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.197979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4bac-account-create-bk65b" event={"ID":"257597c0-e6c4-40b8-9eaa-590d9caeb136","Type":"ContainerDied","Data":"5da2eee9cbed0720c597485e546d3ebbefe89e4dd872ff9efcb3761f1de5329d"} Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.202380 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerID="beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d" exitCode=0 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.202464 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.202804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" event={"ID":"4f5d7b9c-4e17-4717-95a8-b8b871205a3a","Type":"ContainerDied","Data":"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d"} Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.202832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-6cjbf" event={"ID":"4f5d7b9c-4e17-4717-95a8-b8b871205a3a","Type":"ContainerDied","Data":"b603f3040b76265a8b03a6c5ba5a9db3d7dad94882dd1356f111f62110f48273"} Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.202849 4763 scope.go:117] "RemoveContainer" containerID="beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.206829 4763 generic.go:334] "Generic (PLEG): container finished" podID="d036ff61-b8cb-4cb5-a353-206c1306f39b" containerID="f59e250ba8dce3b88a299b3d9f53c567b19f7f179bd84af3fffb86a318ad5368" exitCode=0 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.206869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1178-account-create-nk49j" event={"ID":"d036ff61-b8cb-4cb5-a353-206c1306f39b","Type":"ContainerDied","Data":"f59e250ba8dce3b88a299b3d9f53c567b19f7f179bd84af3fffb86a318ad5368"} Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.209209 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3c9db67-1ae9-4eeb-8023-558bd1136960" containerID="380638c831a780a71f75f79db475f78294719dd4f058d1f572dc7d482dfdeec2" exitCode=0 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.209234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-11f1-account-create-xdj7v" event={"ID":"e3c9db67-1ae9-4eeb-8023-558bd1136960","Type":"ContainerDied","Data":"380638c831a780a71f75f79db475f78294719dd4f058d1f572dc7d482dfdeec2"} Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.216512 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.252094 4763 scope.go:117] "RemoveContainer" containerID="e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.252519 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.253330 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.255088 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config" (OuterVolumeSpecName: "config") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.271351 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7c84\" (UniqueName: \"kubernetes.io/projected/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-kube-api-access-n7c84\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.271378 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.271387 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.271395 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.271430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.291992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.292045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f5d7b9c-4e17-4717-95a8-b8b871205a3a" (UID: "4f5d7b9c-4e17-4717-95a8-b8b871205a3a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.326471 4763 scope.go:117] "RemoveContainer" containerID="beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d" Oct 06 15:11:39 crc kubenswrapper[4763]: E1006 15:11:39.326874 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d\": container with ID starting with beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d not found: ID does not exist" containerID="beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.326912 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d"} err="failed to get container status \"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d\": rpc error: code = NotFound desc = could not find container \"beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d\": container with ID starting with beee02046628911062641c8b3bae1440d8edb3d0bcb92192f7d452bba2e4b87d not found: ID does not exist" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.326937 4763 scope.go:117] "RemoveContainer" containerID="e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412" Oct 06 15:11:39 crc kubenswrapper[4763]: E1006 15:11:39.327232 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412\": container with ID starting with e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412 not found: ID does not exist" containerID="e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.327254 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412"} err="failed to get container status \"e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412\": rpc error: code = NotFound desc = could not find container \"e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412\": container with ID starting with e07fa613182eef8b6ccec489542f420e94fb084af45a4f4b587a15d3e660a412 not found: ID does not exist" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.375870 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.375894 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5d7b9c-4e17-4717-95a8-b8b871205a3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.392006 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.548751 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.562398 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-6cjbf"] Oct 06 
15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.614316 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" path="/var/lib/kubelet/pods/4f5d7b9c-4e17-4717-95a8-b8b871205a3a/volumes" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.615131 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqjsv"] Oct 06 15:11:39 crc kubenswrapper[4763]: W1006 15:11:39.624816 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod732c8907_25de_4ba0_a4ff_e9d70af7afbd.slice/crio-d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678 WatchSource:0}: Error finding container d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678: Status 404 returned error can't find the container with id d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.626774 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:39 crc kubenswrapper[4763]: E1006 15:11:39.628202 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="dnsmasq-dns" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.628221 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="dnsmasq-dns" Oct 06 15:11:39 crc kubenswrapper[4763]: E1006 15:11:39.628262 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="init" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.628268 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="init" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.631189 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5d7b9c-4e17-4717-95a8-b8b871205a3a" containerName="dnsmasq-dns" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.638226 4763 util.go:30] "No sandbox for pod can be found. 
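The W-level manager.go:1169 warnings above come from cAdvisor's cgroup watcher rather than the kubelet proper: during this burst of pod creation a crio-*.slice cgroup can appear (or vanish) before the matching container is inspectable, and the watch event fails with Status 404. On a churn-heavy node these are typically transient; a sketch for listing the affected container IDs so they can be matched against the pods starting at the same timestamps (kubelet.log illustrative):

```python
import re

# cAdvisor's wording is distinct from the CRI NotFound errors above, so this
# only catches the watch-event warnings.
PAT = re.compile(r"can't find the container with id ([0-9a-f]{64})")

with open("kubelet.log") as f:  # illustrative filename
    ids = sorted(set(PAT.findall(f.read())))
print("\n".join(ids))
```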
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.657164 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.657790 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-m6ktk" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.657969 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.659205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.671223 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.679810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.679908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.679945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.679987 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.680034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.680072 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.680117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.680134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk55b\" (UniqueName: \"kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.686215 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.688287 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.694930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.697896 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.698164 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 15:11:39 crc kubenswrapper[4763]: W1006 15:11:39.702158 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dd38d95_a93a_489c_8e6b_e7a1a597ad13.slice/crio-825b8fbb51dc12bddc29920049a1f9e0ff685e4e8b9bc08cc8d0f864ca4780c8 WatchSource:0}: Error finding container 825b8fbb51dc12bddc29920049a1f9e0ff685e4e8b9bc08cc8d0f864ca4780c8: Status 404 returned error can't find the container with id 825b8fbb51dc12bddc29920049a1f9e0ff685e4e8b9bc08cc8d0f864ca4780c8 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.707553 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781730 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.781864 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.782229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.782267 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk55b\" (UniqueName: \"kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.790343 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.791406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.792320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.792437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.795250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.798357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.799465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.806296 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk55b\" (UniqueName: \"kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.818954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.853435 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:11:39 crc kubenswrapper[4763]: W1006 15:11:39.859999 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f18dfb_e90b_4de9_bdfc_93d8a570f23a.slice/crio-533d492f2a5841d54bce55e4b46bc5976ea884c66dab3bf1bd36b936aec769c9 WatchSource:0}: Error finding container 533d492f2a5841d54bce55e4b46bc5976ea884c66dab3bf1bd36b936aec769c9: Status 404 returned error can't find the container with id 533d492f2a5841d54bce55e4b46bc5976ea884c66dab3bf1bd36b936aec769c9 Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlh2s\" (UniqueName: \"kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc 
kubenswrapper[4763]: I1006 15:11:39.883860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.883983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.963123 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ssv6p"] Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlh2s\" (UniqueName: \"kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985925 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.985954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.986011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.987405 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.987533 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.989211 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.994713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:39 crc kubenswrapper[4763]: I1006 15:11:39.995309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.003185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.003346 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.013115 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.018287 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlh2s\" (UniqueName: \"kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.031916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.042300 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.254218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqjsv" event={"ID":"732c8907-25de-4ba0-a4ff-e9d70af7afbd","Type":"ContainerStarted","Data":"d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.262071 4763 generic.go:334] "Generic (PLEG): container finished" podID="3e3f8d75-fb04-4e27-b404-9e0c172895f0" containerID="1490d3a368279000d6915cdc0ae13a934987059348e55b7abea7d7a8fe8747c7" exitCode=0 Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.262175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" event={"ID":"3e3f8d75-fb04-4e27-b404-9e0c172895f0","Type":"ContainerDied","Data":"1490d3a368279000d6915cdc0ae13a934987059348e55b7abea7d7a8fe8747c7"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.262201 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" event={"ID":"3e3f8d75-fb04-4e27-b404-9e0c172895f0","Type":"ContainerStarted","Data":"26d643fe5a214ba638c2b8e3423b233f1ce804066c21a149a6e2e45a0bdb5b6d"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.264717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" event={"ID":"84f18dfb-e90b-4de9-bdfc-93d8a570f23a","Type":"ContainerStarted","Data":"533d492f2a5841d54bce55e4b46bc5976ea884c66dab3bf1bd36b936aec769c9"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.272530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerStarted","Data":"825b8fbb51dc12bddc29920049a1f9e0ff685e4e8b9bc08cc8d0f864ca4780c8"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.276212 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssv6p" event={"ID":"c5a00a8d-3928-4ece-9d1c-c3ca6993756b","Type":"ContainerStarted","Data":"373c008c1de01ac3389befa2207e3917e47b9fd7a4c8ab08b5e03ce4722f3695"} Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.943529 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.968952 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.973372 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.978731 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:40 crc kubenswrapper[4763]: I1006 15:11:40.979606 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.121385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh7qb\" (UniqueName: \"kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.122467 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzft5\" (UniqueName: \"kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5\") pod \"d036ff61-b8cb-4cb5-a353-206c1306f39b\" (UID: \"d036ff61-b8cb-4cb5-a353-206c1306f39b\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.122878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.122958 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.123002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.123026 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtgrb\" (UniqueName: \"kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb\") pod \"e3c9db67-1ae9-4eeb-8023-558bd1136960\" (UID: \"e3c9db67-1ae9-4eeb-8023-558bd1136960\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.123111 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xp2\" (UniqueName: \"kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2\") pod \"257597c0-e6c4-40b8-9eaa-590d9caeb136\" (UID: \"257597c0-e6c4-40b8-9eaa-590d9caeb136\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.123169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.123204 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb\") pod \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\" (UID: \"3e3f8d75-fb04-4e27-b404-9e0c172895f0\") " Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.128263 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2" (OuterVolumeSpecName: 
"kube-api-access-h5xp2") pod "257597c0-e6c4-40b8-9eaa-590d9caeb136" (UID: "257597c0-e6c4-40b8-9eaa-590d9caeb136"). InnerVolumeSpecName "kube-api-access-h5xp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.128894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb" (OuterVolumeSpecName: "kube-api-access-qh7qb") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "kube-api-access-qh7qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.128998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5" (OuterVolumeSpecName: "kube-api-access-gzft5") pod "d036ff61-b8cb-4cb5-a353-206c1306f39b" (UID: "d036ff61-b8cb-4cb5-a353-206c1306f39b"). InnerVolumeSpecName "kube-api-access-gzft5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.129056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb" (OuterVolumeSpecName: "kube-api-access-dtgrb") pod "e3c9db67-1ae9-4eeb-8023-558bd1136960" (UID: "e3c9db67-1ae9-4eeb-8023-558bd1136960"). InnerVolumeSpecName "kube-api-access-dtgrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.153862 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.158706 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.161413 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.166023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config" (OuterVolumeSpecName: "config") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.166123 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e3f8d75-fb04-4e27-b404-9e0c172895f0" (UID: "3e3f8d75-fb04-4e27-b404-9e0c172895f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225659 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225698 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225710 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225721 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtgrb\" (UniqueName: \"kubernetes.io/projected/e3c9db67-1ae9-4eeb-8023-558bd1136960-kube-api-access-dtgrb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225734 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xp2\" (UniqueName: \"kubernetes.io/projected/257597c0-e6c4-40b8-9eaa-590d9caeb136-kube-api-access-h5xp2\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225745 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225754 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3f8d75-fb04-4e27-b404-9e0c172895f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225764 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh7qb\" (UniqueName: \"kubernetes.io/projected/3e3f8d75-fb04-4e27-b404-9e0c172895f0-kube-api-access-qh7qb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.225774 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzft5\" (UniqueName: \"kubernetes.io/projected/d036ff61-b8cb-4cb5-a353-206c1306f39b-kube-api-access-gzft5\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.295737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4bac-account-create-bk65b" event={"ID":"257597c0-e6c4-40b8-9eaa-590d9caeb136","Type":"ContainerDied","Data":"a73bc797884635726ed20d2f8d9d94d8915fa6008b485ab1eb7880a73b1e0684"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.295779 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a73bc797884635726ed20d2f8d9d94d8915fa6008b485ab1eb7880a73b1e0684" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.295823 4763 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4bac-account-create-bk65b" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.299676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqjsv" event={"ID":"732c8907-25de-4ba0-a4ff-e9d70af7afbd","Type":"ContainerStarted","Data":"213a25e22f02ae7ac149710de55e90923971454fd1c545a530dba766bc0d6bb1"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.303701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1178-account-create-nk49j" event={"ID":"d036ff61-b8cb-4cb5-a353-206c1306f39b","Type":"ContainerDied","Data":"7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.303730 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6a3430bb2a78512bb7899056bf1f96e4a96f746e5d167f31360aaaf48b803e" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.303773 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1178-account-create-nk49j" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.311532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" event={"ID":"3e3f8d75-fb04-4e27-b404-9e0c172895f0","Type":"ContainerDied","Data":"26d643fe5a214ba638c2b8e3423b233f1ce804066c21a149a6e2e45a0bdb5b6d"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.311588 4763 scope.go:117] "RemoveContainer" containerID="1490d3a368279000d6915cdc0ae13a934987059348e55b7abea7d7a8fe8747c7" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.312324 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.322361 4763 generic.go:334] "Generic (PLEG): container finished" podID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerID="638eff170fd262a59d2315d59828df4187b161759a932ecbb6e6672aa691dc1f" exitCode=0 Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.322446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" event={"ID":"84f18dfb-e90b-4de9-bdfc-93d8a570f23a","Type":"ContainerDied","Data":"638eff170fd262a59d2315d59828df4187b161759a932ecbb6e6672aa691dc1f"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.324810 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xqjsv" podStartSLOduration=3.324796403 podStartE2EDuration="3.324796403s" podCreationTimestamp="2025-10-06 15:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:41.314646993 +0000 UTC m=+1098.469939525" watchObservedRunningTime="2025-10-06 15:11:41.324796403 +0000 UTC m=+1098.480088925" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.328693 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerStarted","Data":"c7f170cde0f807500bab1588b4b6bf1e7f43342304b0671093caf761a7fc59c7"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.333067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-11f1-account-create-xdj7v" 
event={"ID":"e3c9db67-1ae9-4eeb-8023-558bd1136960","Type":"ContainerDied","Data":"57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52"} Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.333126 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57a13631097856bd3be99766122e06b7600a71ff129f7e18e369f4d4be37aa52" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.333187 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-11f1-account-create-xdj7v" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.420513 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.428073 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fnxf"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.538140 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.606790 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3f8d75-fb04-4e27-b404-9e0c172895f0" path="/var/lib/kubelet/pods/3e3f8d75-fb04-4e27-b404-9e0c172895f0/volumes" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.608084 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.608107 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.715895 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:41 crc kubenswrapper[4763]: W1006 15:11:41.733850 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00b2db37_97c4_4edd_8596_67f10d9f3054.slice/crio-c61589dedacec4cfa0010e7ef0859b7b82f3724b9130beff64c02a8a47246a99 WatchSource:0}: Error finding container c61589dedacec4cfa0010e7ef0859b7b82f3724b9130beff64c02a8a47246a99: Status 404 returned error can't find the container with id c61589dedacec4cfa0010e7ef0859b7b82f3724b9130beff64c02a8a47246a99 Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.790068 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vs2bl"] Oct 06 15:11:41 crc kubenswrapper[4763]: E1006 15:11:41.792954 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257597c0-e6c4-40b8-9eaa-590d9caeb136" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.792973 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="257597c0-e6c4-40b8-9eaa-590d9caeb136" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: E1006 15:11:41.793031 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c9db67-1ae9-4eeb-8023-558bd1136960" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793037 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c9db67-1ae9-4eeb-8023-558bd1136960" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: E1006 15:11:41.793053 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3f8d75-fb04-4e27-b404-9e0c172895f0" containerName="init" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 
15:11:41.793058 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3f8d75-fb04-4e27-b404-9e0c172895f0" containerName="init" Oct 06 15:11:41 crc kubenswrapper[4763]: E1006 15:11:41.793083 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d036ff61-b8cb-4cb5-a353-206c1306f39b" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793091 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d036ff61-b8cb-4cb5-a353-206c1306f39b" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="257597c0-e6c4-40b8-9eaa-590d9caeb136" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793572 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3f8d75-fb04-4e27-b404-9e0c172895f0" containerName="init" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793601 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c9db67-1ae9-4eeb-8023-558bd1136960" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.793636 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d036ff61-b8cb-4cb5-a353-206c1306f39b" containerName="mariadb-account-create" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.794355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.796976 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.800165 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vs2bl"] Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.800855 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hvbxm" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.801092 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942198 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942314 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942351 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnbl\" (UniqueName: 
\"kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:41 crc kubenswrapper[4763]: I1006 15:11:41.942414 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043893 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.043978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnbl\" (UniqueName: \"kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl\") pod \"cinder-db-sync-vs2bl\" (UID: 
\"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.049941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.050660 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.051030 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.054013 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.064704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnbl\" (UniqueName: \"kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl\") pod \"cinder-db-sync-vs2bl\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.095876 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p7v6n"] Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.097504 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.101239 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.101473 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-crhfx" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.101640 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.105391 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p7v6n"] Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.191728 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.250202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7w4l\" (UniqueName: \"kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.250357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.250457 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.351463 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.351802 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w4l\" (UniqueName: \"kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.351923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.357431 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.357437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.358033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" event={"ID":"84f18dfb-e90b-4de9-bdfc-93d8a570f23a","Type":"ContainerStarted","Data":"bed4a5f7dee4e8d86257c6804ba8b89ef1c3fa69210bede580ab29b417466c14"} Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.359669 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.369973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerStarted","Data":"3cd707e58f80b4b0759ac01dc89f74602175c17cb3c29535e5725ef255825ac8"} Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.372007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w4l\" (UniqueName: \"kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l\") pod \"neutron-db-sync-p7v6n\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.374214 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerStarted","Data":"c61589dedacec4cfa0010e7ef0859b7b82f3724b9130beff64c02a8a47246a99"} Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.383098 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" podStartSLOduration=4.383080728 podStartE2EDuration="4.383080728s" podCreationTimestamp="2025-10-06 15:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:42.376582859 +0000 UTC m=+1099.531875401" watchObservedRunningTime="2025-10-06 15:11:42.383080728 +0000 UTC m=+1099.538373240" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.426006 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.657573 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vs2bl"] Oct 06 15:11:42 crc kubenswrapper[4763]: I1006 15:11:42.885192 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p7v6n"] Oct 06 15:11:42 crc kubenswrapper[4763]: W1006 15:11:42.894469 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ec9a93_3d8c_49a0_8dcd_bb10f1270deb.slice/crio-ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0 WatchSource:0}: Error finding container ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0: Status 404 returned error can't find the container with id ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0 Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.384317 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerStarted","Data":"4922ea53dc892a721efe57df0c9604a179cbbf9ccc5a0c971c08a39425796582"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.384437 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-log" containerID="cri-o://3cd707e58f80b4b0759ac01dc89f74602175c17cb3c29535e5725ef255825ac8" gracePeriod=30 Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.384464 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-httpd" containerID="cri-o://4922ea53dc892a721efe57df0c9604a179cbbf9ccc5a0c971c08a39425796582" gracePeriod=30 Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.386160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs2bl" event={"ID":"eb17b8c1-c76b-4802-aac9-daaacea9e726","Type":"ContainerStarted","Data":"89efe739a070eb3a75ef7cb05c7213101ab8250e11ce9c7c361bc5f936a78e16"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.389198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerStarted","Data":"8fc1f419342b4b48292e6bcae78a7a6e69de704e7c951ffc50c1056a31f46075"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.389253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerStarted","Data":"20ffda973e5c770f7783c0a1346dfdf869480b293cb3efe0e4b0e9937060e417"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.389297 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-httpd" containerID="cri-o://8fc1f419342b4b48292e6bcae78a7a6e69de704e7c951ffc50c1056a31f46075" gracePeriod=30 Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.389447 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-log" containerID="cri-o://20ffda973e5c770f7783c0a1346dfdf869480b293cb3efe0e4b0e9937060e417" gracePeriod=30 Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.393371 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7v6n" event={"ID":"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb","Type":"ContainerStarted","Data":"fcf6164b2d181242c9b5562ef21adf8bd1d27334b6c36d6883ad95c6af383a2e"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.393404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7v6n" event={"ID":"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb","Type":"ContainerStarted","Data":"ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0"} Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.450993 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.450973848 podStartE2EDuration="5.450973848s" podCreationTimestamp="2025-10-06 15:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:43.414915914 +0000 UTC m=+1100.570208456" watchObservedRunningTime="2025-10-06 15:11:43.450973848 +0000 UTC m=+1100.606266360" Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.465368 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p7v6n" podStartSLOduration=1.465338054 podStartE2EDuration="1.465338054s" podCreationTimestamp="2025-10-06 15:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:43.450886786 +0000 UTC m=+1100.606179318" watchObservedRunningTime="2025-10-06 
15:11:43.465338054 +0000 UTC m=+1100.620630576" Oct 06 15:11:43 crc kubenswrapper[4763]: I1006 15:11:43.485760 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.485742846 podStartE2EDuration="5.485742846s" podCreationTimestamp="2025-10-06 15:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:43.474069504 +0000 UTC m=+1100.629362016" watchObservedRunningTime="2025-10-06 15:11:43.485742846 +0000 UTC m=+1100.641035358" Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.403043 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4f96bfd-096f-447b-9d35-e962146985a7" containerID="4922ea53dc892a721efe57df0c9604a179cbbf9ccc5a0c971c08a39425796582" exitCode=0 Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.403072 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4f96bfd-096f-447b-9d35-e962146985a7" containerID="3cd707e58f80b4b0759ac01dc89f74602175c17cb3c29535e5725ef255825ac8" exitCode=143 Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.403121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerDied","Data":"4922ea53dc892a721efe57df0c9604a179cbbf9ccc5a0c971c08a39425796582"} Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.403162 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerDied","Data":"3cd707e58f80b4b0759ac01dc89f74602175c17cb3c29535e5725ef255825ac8"} Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.405834 4763 generic.go:334] "Generic (PLEG): container finished" podID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerID="8fc1f419342b4b48292e6bcae78a7a6e69de704e7c951ffc50c1056a31f46075" exitCode=0 Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.405855 4763 generic.go:334] "Generic (PLEG): container finished" podID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerID="20ffda973e5c770f7783c0a1346dfdf869480b293cb3efe0e4b0e9937060e417" exitCode=143 Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.405889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerDied","Data":"8fc1f419342b4b48292e6bcae78a7a6e69de704e7c951ffc50c1056a31f46075"} Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.405904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerDied","Data":"20ffda973e5c770f7783c0a1346dfdf869480b293cb3efe0e4b0e9937060e417"} Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.408013 4763 generic.go:334] "Generic (PLEG): container finished" podID="732c8907-25de-4ba0-a4ff-e9d70af7afbd" containerID="213a25e22f02ae7ac149710de55e90923971454fd1c545a530dba766bc0d6bb1" exitCode=0 Oct 06 15:11:44 crc kubenswrapper[4763]: I1006 15:11:44.408071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqjsv" event={"ID":"732c8907-25de-4ba0-a4ff-e9d70af7afbd","Type":"ContainerDied","Data":"213a25e22f02ae7ac149710de55e90923971454fd1c545a530dba766bc0d6bb1"} Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.703237 4763 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wg4dh"] Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.705167 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.707941 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.708094 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktsj6" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.720090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wg4dh"] Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.829500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jxk\" (UniqueName: \"kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.829579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.829596 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.932464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jxk\" (UniqueName: \"kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.932655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.932685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.938538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 
15:11:46.952564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:46 crc kubenswrapper[4763]: I1006 15:11:46.973312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jxk\" (UniqueName: \"kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk\") pod \"barbican-db-sync-wg4dh\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.027782 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.453834 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqjsv" event={"ID":"732c8907-25de-4ba0-a4ff-e9d70af7afbd","Type":"ContainerDied","Data":"d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678"} Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.453873 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7fe3b88fa36a5ada99aa2b7a776cc9fff76c3ed0b58585d9243f6cccb8d0678" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.495663 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645596 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645822 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrch\" (UniqueName: \"kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.645895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data\") pod \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\" (UID: \"732c8907-25de-4ba0-a4ff-e9d70af7afbd\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.655368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts" (OuterVolumeSpecName: "scripts") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.662440 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch" (OuterVolumeSpecName: "kube-api-access-dkrch") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "kube-api-access-dkrch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.663225 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.682775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.707965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data" (OuterVolumeSpecName: "config-data") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.711346 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732c8907-25de-4ba0-a4ff-e9d70af7afbd" (UID: "732c8907-25de-4ba0-a4ff-e9d70af7afbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747550 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747579 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747665 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747675 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747684 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrch\" (UniqueName: \"kubernetes.io/projected/732c8907-25de-4ba0-a4ff-e9d70af7afbd-kube-api-access-dkrch\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.747692 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732c8907-25de-4ba0-a4ff-e9d70af7afbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.874863 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951157 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951246 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951302 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951365 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 
06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlh2s\" (UniqueName: \"kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.951499 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data\") pod \"00b2db37-97c4-4edd-8596-67f10d9f3054\" (UID: \"00b2db37-97c4-4edd-8596-67f10d9f3054\") " Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.953142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.953205 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs" (OuterVolumeSpecName: "logs") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.956101 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.957828 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts" (OuterVolumeSpecName: "scripts") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.964496 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s" (OuterVolumeSpecName: "kube-api-access-rlh2s") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "kube-api-access-rlh2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:47 crc kubenswrapper[4763]: I1006 15:11:47.989754 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.011906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data" (OuterVolumeSpecName: "config-data") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.013837 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "00b2db37-97c4-4edd-8596-67f10d9f3054" (UID: "00b2db37-97c4-4edd-8596-67f10d9f3054"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053148 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlh2s\" (UniqueName: \"kubernetes.io/projected/00b2db37-97c4-4edd-8596-67f10d9f3054-kube-api-access-rlh2s\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053188 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053200 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053241 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053257 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00b2db37-97c4-4edd-8596-67f10d9f3054-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053265 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053273 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.053281 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b2db37-97c4-4edd-8596-67f10d9f3054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.056355 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wg4dh"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.080416 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.154815 4763 reconciler_common.go:293] 
"Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.468225 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wg4dh" event={"ID":"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb","Type":"ContainerStarted","Data":"40efa4efe3ad24f9fd31ed2045edf51060a69ef740eefe9384898d4251830900"} Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.473272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerStarted","Data":"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21"} Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.477244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00b2db37-97c4-4edd-8596-67f10d9f3054","Type":"ContainerDied","Data":"c61589dedacec4cfa0010e7ef0859b7b82f3724b9130beff64c02a8a47246a99"} Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.477344 4763 scope.go:117] "RemoveContainer" containerID="8fc1f419342b4b48292e6bcae78a7a6e69de704e7c951ffc50c1056a31f46075" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.477611 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.489835 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqjsv" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.489835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssv6p" event={"ID":"c5a00a8d-3928-4ece-9d1c-c3ca6993756b","Type":"ContainerStarted","Data":"8d4dbe263870060d5d730461509c5145f0a5edac849cd22428e9070476023189"} Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.524055 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ssv6p" podStartSLOduration=2.974030946 podStartE2EDuration="10.524035064s" podCreationTimestamp="2025-10-06 15:11:38 +0000 UTC" firstStartedPulling="2025-10-06 15:11:39.995779658 +0000 UTC m=+1097.151072170" lastFinishedPulling="2025-10-06 15:11:47.545783776 +0000 UTC m=+1104.701076288" observedRunningTime="2025-10-06 15:11:48.508583688 +0000 UTC m=+1105.663876200" watchObservedRunningTime="2025-10-06 15:11:48.524035064 +0000 UTC m=+1105.679327576" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.524716 4763 scope.go:117] "RemoveContainer" containerID="20ffda973e5c770f7783c0a1346dfdf869480b293cb3efe0e4b0e9937060e417" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.534965 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.557276 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.579519 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:48 crc kubenswrapper[4763]: E1006 15:11:48.580284 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-httpd" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580307 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-httpd" Oct 06 15:11:48 crc kubenswrapper[4763]: E1006 15:11:48.580327 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-log" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580336 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-log" Oct 06 15:11:48 crc kubenswrapper[4763]: E1006 15:11:48.580357 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732c8907-25de-4ba0-a4ff-e9d70af7afbd" containerName="keystone-bootstrap" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580365 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="732c8907-25de-4ba0-a4ff-e9d70af7afbd" containerName="keystone-bootstrap" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580590 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="732c8907-25de-4ba0-a4ff-e9d70af7afbd" containerName="keystone-bootstrap" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580631 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-log" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.580646 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" containerName="glance-httpd" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.581815 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.584852 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.585026 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.586812 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.670706 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hx9w\" (UniqueName: \"kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.670768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.670798 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.670994 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.671031 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.671054 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.671110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.671148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.705456 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xqjsv"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.725195 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xqjsv"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.748296 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xbqjk"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.767257 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.769249 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.769480 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.769640 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtw6" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.770800 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.771377 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbqjk"] Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772192 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hx9w\" (UniqueName: \"kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.772871 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.773131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.778477 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.788064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.789719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.791910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.793754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.795796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hx9w\" (UniqueName: \"kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.837016 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") " pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.867769 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llh9b\" (UniqueName: \"kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873861 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873913 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.873956 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.931639 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.974558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.974667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.974704 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.974731 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.974792 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.975464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk55b\" (UniqueName: \"kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.975936 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.975946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs" (OuterVolumeSpecName: "logs") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.976236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run\") pod \"f4f96bfd-096f-447b-9d35-e962146985a7\" (UID: \"f4f96bfd-096f-447b-9d35-e962146985a7\") " Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.976491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.976544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llh9b\" (UniqueName: \"kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.976681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.978854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.980864 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.982691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.983795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.984217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.985431 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.985605 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f96bfd-096f-447b-9d35-e962146985a7-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.987270 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.987754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.987857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.988772 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts" (OuterVolumeSpecName: "scripts") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.988928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b" (OuterVolumeSpecName: "kube-api-access-fk55b") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "kube-api-access-fk55b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.992185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:48 crc kubenswrapper[4763]: I1006 15:11:48.994715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.000821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llh9b\" (UniqueName: \"kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b\") pod \"keystone-bootstrap-xbqjk\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.023212 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.038515 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data" (OuterVolumeSpecName: "config-data") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.050013 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4f96bfd-096f-447b-9d35-e962146985a7" (UID: "f4f96bfd-096f-447b-9d35-e962146985a7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087642 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087697 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087708 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087716 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087726 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk55b\" (UniqueName: \"kubernetes.io/projected/f4f96bfd-096f-447b-9d35-e962146985a7-kube-api-access-fk55b\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.087736 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f96bfd-096f-447b-9d35-e962146985a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.106740 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.145882 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.190747 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.255392 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.330153 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.330866 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="dnsmasq-dns" containerID="cri-o://f71af9f4b732e58384f766e8873c8757dc3c2ead7944e432f655225af6e9404f" gracePeriod=10 Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.484902 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.500460 4763 generic.go:334] "Generic (PLEG): container finished" podID="35990813-4948-4085-b754-58e29f1f89c3" containerID="f71af9f4b732e58384f766e8873c8757dc3c2ead7944e432f655225af6e9404f" exitCode=0 Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.500535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" event={"ID":"35990813-4948-4085-b754-58e29f1f89c3","Type":"ContainerDied","Data":"f71af9f4b732e58384f766e8873c8757dc3c2ead7944e432f655225af6e9404f"} Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.502504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4f96bfd-096f-447b-9d35-e962146985a7","Type":"ContainerDied","Data":"c7f170cde0f807500bab1588b4b6bf1e7f43342304b0671093caf761a7fc59c7"} Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.502526 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.502571 4763 scope.go:117] "RemoveContainer" containerID="4922ea53dc892a721efe57df0c9604a179cbbf9ccc5a0c971c08a39425796582" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.551482 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.557389 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.585184 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b2db37-97c4-4edd-8596-67f10d9f3054" path="/var/lib/kubelet/pods/00b2db37-97c4-4edd-8596-67f10d9f3054/volumes" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.585951 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732c8907-25de-4ba0-a4ff-e9d70af7afbd" path="/var/lib/kubelet/pods/732c8907-25de-4ba0-a4ff-e9d70af7afbd/volumes" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.586500 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" path="/var/lib/kubelet/pods/f4f96bfd-096f-447b-9d35-e962146985a7/volumes" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.588798 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:49 crc kubenswrapper[4763]: E1006 15:11:49.589147 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-httpd" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.589169 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-httpd" Oct 06 15:11:49 crc kubenswrapper[4763]: E1006 15:11:49.589212 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-log" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.589220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-log" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.589386 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-log" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.589407 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f96bfd-096f-447b-9d35-e962146985a7" containerName="glance-httpd" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.595934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.596039 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.599896 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.600203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.650773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbqjk"] Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700583 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700638 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700891 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.700934 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6rhw\" (UniqueName: \"kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.701021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805453 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.805510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6rhw\" (UniqueName: \"kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.808799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.809115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.809346 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.811862 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.815960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.818078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.825148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.826783 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6rhw\" (UniqueName: \"kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.838675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") " pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.919365 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:11:49 crc kubenswrapper[4763]: W1006 15:11:49.962640 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1611a98_63a9_434a_b45a_164a527fe97e.slice/crio-317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260 WatchSource:0}: Error finding container 317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260: Status 404 returned error can't find the container with id 317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260 Oct 06 15:11:49 crc kubenswrapper[4763]: W1006 15:11:49.964820 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33bede1_4e9c_4226_b8e0_9ad2bb7ad80f.slice/crio-f1877118d0bd6a54fa7da026217e396e77f26e332b2b8e18fde8e512bb4b89f2 WatchSource:0}: Error finding container f1877118d0bd6a54fa7da026217e396e77f26e332b2b8e18fde8e512bb4b89f2: Status 404 returned error can't find the container with id f1877118d0bd6a54fa7da026217e396e77f26e332b2b8e18fde8e512bb4b89f2 Oct 06 15:11:49 crc kubenswrapper[4763]: I1006 15:11:49.981985 4763 scope.go:117] "RemoveContainer" containerID="3cd707e58f80b4b0759ac01dc89f74602175c17cb3c29535e5725ef255825ac8" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.269802 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.425313 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config\") pod \"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.425580 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb\") pod \"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.425890 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc\") pod \"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.425923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbns\" (UniqueName: \"kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns\") pod \"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.426047 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0\") pod \"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.426090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb\") pod 
\"35990813-4948-4085-b754-58e29f1f89c3\" (UID: \"35990813-4948-4085-b754-58e29f1f89c3\") " Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.429839 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns" (OuterVolumeSpecName: "kube-api-access-xfbns") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "kube-api-access-xfbns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.469190 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.469245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.474081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.474834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config" (OuterVolumeSpecName: "config") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.474991 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35990813-4948-4085-b754-58e29f1f89c3" (UID: "35990813-4948-4085-b754-58e29f1f89c3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.513367 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbqjk" event={"ID":"a1611a98-63a9-434a-b45a-164a527fe97e","Type":"ContainerStarted","Data":"b9e961308b9e2a5ad800c19e4dd19080c85b3ac3bc689989ac8c957262ec2a37"} Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.513412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbqjk" event={"ID":"a1611a98-63a9-434a-b45a-164a527fe97e","Type":"ContainerStarted","Data":"317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260"} Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.520114 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" event={"ID":"35990813-4948-4085-b754-58e29f1f89c3","Type":"ContainerDied","Data":"6d0868553ed14d3458d7fc6d9c7bc15b5ccbc06978a725af0f97809a8767e272"} Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.520152 4763 scope.go:117] "RemoveContainer" containerID="f71af9f4b732e58384f766e8873c8757dc3c2ead7944e432f655225af6e9404f" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.520230 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-sj596" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529418 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529445 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529456 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529465 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbns\" (UniqueName: \"kubernetes.io/projected/35990813-4948-4085-b754-58e29f1f89c3-kube-api-access-xfbns\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529473 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.529481 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35990813-4948-4085-b754-58e29f1f89c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.530834 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerStarted","Data":"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3"} Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.532077 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xbqjk" podStartSLOduration=2.532058383 podStartE2EDuration="2.532058383s" podCreationTimestamp="2025-10-06 15:11:48 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:50.528892726 +0000 UTC m=+1107.684185248" watchObservedRunningTime="2025-10-06 15:11:50.532058383 +0000 UTC m=+1107.687350895" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.537666 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerStarted","Data":"f1877118d0bd6a54fa7da026217e396e77f26e332b2b8e18fde8e512bb4b89f2"} Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.552309 4763 scope.go:117] "RemoveContainer" containerID="c66b97056318e210f006c862a966562e2ccd023d40741fa72a11c33d161518e6" Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.556543 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.563282 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-sj596"] Oct 06 15:11:50 crc kubenswrapper[4763]: I1006 15:11:50.639648 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.548283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerStarted","Data":"cf6dde839e85f0a737f41c91e1e0200462667c5c36ba626cfbd076b18ca61db2"} Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.548529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerStarted","Data":"05c623959fc0dfd300ec86ac454057decd6e955fd3375fb6906f70967f589922"} Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.549784 4763 generic.go:334] "Generic (PLEG): container finished" podID="c5a00a8d-3928-4ece-9d1c-c3ca6993756b" containerID="8d4dbe263870060d5d730461509c5145f0a5edac849cd22428e9070476023189" exitCode=0 Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.549851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssv6p" event={"ID":"c5a00a8d-3928-4ece-9d1c-c3ca6993756b","Type":"ContainerDied","Data":"8d4dbe263870060d5d730461509c5145f0a5edac849cd22428e9070476023189"} Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.551170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerStarted","Data":"0890ad246b756974b6b74972304dcf73498cc71646d0dabba0480b45f1736aab"} Oct 06 15:11:51 crc kubenswrapper[4763]: I1006 15:11:51.587569 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35990813-4948-4085-b754-58e29f1f89c3" path="/var/lib/kubelet/pods/35990813-4948-4085-b754-58e29f1f89c3/volumes" Oct 06 15:11:52 crc kubenswrapper[4763]: I1006 15:11:52.993723 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.076251 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs\") pod \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.077091 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts\") pod \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.077233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle\") pod \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.077253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data\") pod \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.077275 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgxkm\" (UniqueName: \"kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm\") pod \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\" (UID: \"c5a00a8d-3928-4ece-9d1c-c3ca6993756b\") " Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.076960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs" (OuterVolumeSpecName: "logs") pod "c5a00a8d-3928-4ece-9d1c-c3ca6993756b" (UID: "c5a00a8d-3928-4ece-9d1c-c3ca6993756b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.082020 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts" (OuterVolumeSpecName: "scripts") pod "c5a00a8d-3928-4ece-9d1c-c3ca6993756b" (UID: "c5a00a8d-3928-4ece-9d1c-c3ca6993756b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.082142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm" (OuterVolumeSpecName: "kube-api-access-mgxkm") pod "c5a00a8d-3928-4ece-9d1c-c3ca6993756b" (UID: "c5a00a8d-3928-4ece-9d1c-c3ca6993756b"). InnerVolumeSpecName "kube-api-access-mgxkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.101521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5a00a8d-3928-4ece-9d1c-c3ca6993756b" (UID: "c5a00a8d-3928-4ece-9d1c-c3ca6993756b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.111696 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data" (OuterVolumeSpecName: "config-data") pod "c5a00a8d-3928-4ece-9d1c-c3ca6993756b" (UID: "c5a00a8d-3928-4ece-9d1c-c3ca6993756b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.179591 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.179653 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.179670 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.179682 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgxkm\" (UniqueName: \"kubernetes.io/projected/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-kube-api-access-mgxkm\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.179696 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a00a8d-3928-4ece-9d1c-c3ca6993756b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.571517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssv6p" event={"ID":"c5a00a8d-3928-4ece-9d1c-c3ca6993756b","Type":"ContainerDied","Data":"373c008c1de01ac3389befa2207e3917e47b9fd7a4c8ab08b5e03ce4722f3695"} Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.571557 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="373c008c1de01ac3389befa2207e3917e47b9fd7a4c8ab08b5e03ce4722f3695" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.571670 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ssv6p" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.586982 4763 generic.go:334] "Generic (PLEG): container finished" podID="a1611a98-63a9-434a-b45a-164a527fe97e" containerID="b9e961308b9e2a5ad800c19e4dd19080c85b3ac3bc689989ac8c957262ec2a37" exitCode=0 Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.588978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbqjk" event={"ID":"a1611a98-63a9-434a-b45a-164a527fe97e","Type":"ContainerDied","Data":"b9e961308b9e2a5ad800c19e4dd19080c85b3ac3bc689989ac8c957262ec2a37"} Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.696502 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-79c97876dd-6hbjr"] Oct 06 15:11:53 crc kubenswrapper[4763]: E1006 15:11:53.697042 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a00a8d-3928-4ece-9d1c-c3ca6993756b" containerName="placement-db-sync" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.697064 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a00a8d-3928-4ece-9d1c-c3ca6993756b" containerName="placement-db-sync" Oct 06 15:11:53 crc kubenswrapper[4763]: E1006 15:11:53.697108 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="init" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.697116 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="init" Oct 06 15:11:53 crc kubenswrapper[4763]: E1006 15:11:53.697129 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="dnsmasq-dns" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.697137 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="dnsmasq-dns" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.697351 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="35990813-4948-4085-b754-58e29f1f89c3" containerName="dnsmasq-dns" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.697372 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a00a8d-3928-4ece-9d1c-c3ca6993756b" containerName="placement-db-sync" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.698499 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.706792 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.706965 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.707059 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.709778 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngc52" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.709856 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.715199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79c97876dd-6hbjr"] Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.793151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.793443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d72n\" (UniqueName: \"kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.793467 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.793491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.793660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.794066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.794194 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899413 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d72n\" (UniqueName: \"kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899475 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899593 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.899639 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.900110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.902731 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.903583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.904068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.904151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.921143 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:53 crc kubenswrapper[4763]: I1006 15:11:53.923462 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d72n\" (UniqueName: \"kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n\") pod \"placement-79c97876dd-6hbjr\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.049962 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.602330 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wg4dh" event={"ID":"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb","Type":"ContainerStarted","Data":"2353c95d5781717c2fe7a35581cf271acd024551bdd9e72ffc6a127579831157"} Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.604762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerStarted","Data":"aca9843bff84c3cd1eed81cc10cd52d2a1fe3c06e4bcd6a0b65fd16efdfe4813"} Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.608257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerStarted","Data":"aa5d91c1e7400c99fe72de25d10b1fc6f93eb69a791f36f8cd9a5adaaa5ff2aa"} Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.651795 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wg4dh" podStartSLOduration=3.000762082 podStartE2EDuration="8.651777577s" podCreationTimestamp="2025-10-06 15:11:46 +0000 UTC" firstStartedPulling="2025-10-06 15:11:48.067902664 +0000 UTC m=+1105.223195176" lastFinishedPulling="2025-10-06 15:11:53.718918169 +0000 UTC m=+1110.874210671" observedRunningTime="2025-10-06 15:11:54.621170634 +0000 UTC m=+1111.776463146" watchObservedRunningTime="2025-10-06 15:11:54.651777577 +0000 UTC m=+1111.807070089" Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.655479 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.655463139 podStartE2EDuration="5.655463139s" podCreationTimestamp="2025-10-06 15:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:54.647841819 +0000 UTC m=+1111.803134341" watchObservedRunningTime="2025-10-06 15:11:54.655463139 +0000 UTC m=+1111.810755651" Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.682357 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.682329329 podStartE2EDuration="6.682329329s" podCreationTimestamp="2025-10-06 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:11:54.67944513 +0000 UTC m=+1111.834737652" watchObservedRunningTime="2025-10-06 15:11:54.682329329 +0000 UTC m=+1111.837621851" Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.713794 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79c97876dd-6hbjr"] Oct 06 15:11:54 crc kubenswrapper[4763]: W1006 15:11:54.722057 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc465d0a4_ce55_49ff_bdd4_62585989b25b.slice/crio-9b2c25fc5c7409f12cebdeab232ccd2d28aaf39b2736a313ee43a5668ff91f62 WatchSource:0}: Error finding container 9b2c25fc5c7409f12cebdeab232ccd2d28aaf39b2736a313ee43a5668ff91f62: Status 404 returned error can't find the container with id 9b2c25fc5c7409f12cebdeab232ccd2d28aaf39b2736a313ee43a5668ff91f62 Oct 06 15:11:54 crc kubenswrapper[4763]: I1006 15:11:54.955766 4763 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.054877 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.054981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.055065 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.055108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.055149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llh9b\" (UniqueName: \"kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.055300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data\") pod \"a1611a98-63a9-434a-b45a-164a527fe97e\" (UID: \"a1611a98-63a9-434a-b45a-164a527fe97e\") " Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.060163 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.062082 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts" (OuterVolumeSpecName: "scripts") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.062759 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b" (OuterVolumeSpecName: "kube-api-access-llh9b") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "kube-api-access-llh9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.064591 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.086569 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data" (OuterVolumeSpecName: "config-data") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.088408 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1611a98-63a9-434a-b45a-164a527fe97e" (UID: "a1611a98-63a9-434a-b45a-164a527fe97e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157077 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157403 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157413 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157425 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157433 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1611a98-63a9-434a-b45a-164a527fe97e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.157444 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llh9b\" (UniqueName: \"kubernetes.io/projected/a1611a98-63a9-434a-b45a-164a527fe97e-kube-api-access-llh9b\") on node \"crc\" DevicePath \"\"" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.670341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerStarted","Data":"9b2c25fc5c7409f12cebdeab232ccd2d28aaf39b2736a313ee43a5668ff91f62"} Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.673797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbqjk" 
event={"ID":"a1611a98-63a9-434a-b45a-164a527fe97e","Type":"ContainerDied","Data":"317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260"} Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.673855 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="317647d3dba6977170d4ae8aef60f18bd01323d125f4585e8a3540b935109260" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.673961 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbqjk" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.731675 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"] Oct 06 15:11:55 crc kubenswrapper[4763]: E1006 15:11:55.732076 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1611a98-63a9-434a-b45a-164a527fe97e" containerName="keystone-bootstrap" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.732093 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1611a98-63a9-434a-b45a-164a527fe97e" containerName="keystone-bootstrap" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.732277 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1611a98-63a9-434a-b45a-164a527fe97e" containerName="keystone-bootstrap" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.732845 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.737265 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.737499 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.737738 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.737884 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.738035 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.739279 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.750528 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"] Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872696 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qb6h\" (UniqueName: \"kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.872868 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976699 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976750 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976772 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.976817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qb6h\" (UniqueName: \"kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.980301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.981311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.983283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.987183 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.988192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.992538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.995138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qb6h\" (UniqueName: \"kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:55 crc kubenswrapper[4763]: I1006 15:11:55.995939 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data\") pod \"keystone-59bd69c9bf-v6zw6\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") " pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:56 crc kubenswrapper[4763]: I1006 15:11:56.061915 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:11:56 crc kubenswrapper[4763]: I1006 15:11:56.697105 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerStarted","Data":"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3"} Oct 06 15:11:56 crc kubenswrapper[4763]: I1006 15:11:56.960165 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"] Oct 06 15:11:56 crc kubenswrapper[4763]: W1006 15:11:56.969603 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ce3abd_750d_48db_a75f_e3a0d44e042d.slice/crio-6ec565fbcae7b90e50e2cb126f8743cdaca85cbc711207b58eaf1fe128024b13 WatchSource:0}: Error finding container 6ec565fbcae7b90e50e2cb126f8743cdaca85cbc711207b58eaf1fe128024b13: Status 404 returned error can't find the container with id 6ec565fbcae7b90e50e2cb126f8743cdaca85cbc711207b58eaf1fe128024b13 Oct 06 15:11:57 crc kubenswrapper[4763]: I1006 15:11:57.709377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59bd69c9bf-v6zw6" event={"ID":"18ce3abd-750d-48db-a75f-e3a0d44e042d","Type":"ContainerStarted","Data":"6ec565fbcae7b90e50e2cb126f8743cdaca85cbc711207b58eaf1fe128024b13"} Oct 06 15:11:58 crc kubenswrapper[4763]: I1006 15:11:58.933103 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:58 crc kubenswrapper[4763]: I1006 15:11:58.933157 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:58 crc kubenswrapper[4763]: I1006 15:11:58.966350 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:58 crc kubenswrapper[4763]: I1006 15:11:58.971291 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.724397 4763 generic.go:334] "Generic (PLEG): container finished" podID="29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" containerID="2353c95d5781717c2fe7a35581cf271acd024551bdd9e72ffc6a127579831157" exitCode=0 Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.724485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wg4dh" event={"ID":"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb","Type":"ContainerDied","Data":"2353c95d5781717c2fe7a35581cf271acd024551bdd9e72ffc6a127579831157"} Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.724995 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.725019 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.919777 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.919820 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.953202 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 15:11:59 crc kubenswrapper[4763]: I1006 15:11:59.967460 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 15:12:00 crc kubenswrapper[4763]: I1006 15:12:00.742987 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 15:12:00 crc kubenswrapper[4763]: I1006 15:12:00.743285 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 15:12:01 crc kubenswrapper[4763]: I1006 15:12:01.629969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:01 crc kubenswrapper[4763]: I1006 15:12:01.750735 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 15:12:01 crc kubenswrapper[4763]: I1006 15:12:01.759760 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:02 crc kubenswrapper[4763]: I1006 15:12:02.690023 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 15:12:02 crc kubenswrapper[4763]: I1006 15:12:02.757894 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 15:12:02 crc kubenswrapper[4763]: I1006 15:12:02.789608 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 15:12:04 crc kubenswrapper[4763]: I1006 15:12:04.777096 4763 generic.go:334] "Generic (PLEG): container finished" podID="48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" containerID="fcf6164b2d181242c9b5562ef21adf8bd1d27334b6c36d6883ad95c6af383a2e" exitCode=0 Oct 06 15:12:04 crc kubenswrapper[4763]: I1006 15:12:04.777234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7v6n" 
event={"ID":"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb","Type":"ContainerDied","Data":"fcf6164b2d181242c9b5562ef21adf8bd1d27334b6c36d6883ad95c6af383a2e"} Oct 06 15:12:07 crc kubenswrapper[4763]: I1006 15:12:07.804406 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59bd69c9bf-v6zw6" event={"ID":"18ce3abd-750d-48db-a75f-e3a0d44e042d","Type":"ContainerStarted","Data":"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207"} Oct 06 15:12:07 crc kubenswrapper[4763]: I1006 15:12:07.804839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:12:07 crc kubenswrapper[4763]: I1006 15:12:07.833789 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-59bd69c9bf-v6zw6" podStartSLOduration=12.833770995 podStartE2EDuration="12.833770995s" podCreationTimestamp="2025-10-06 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:07.821906268 +0000 UTC m=+1124.977198780" watchObservedRunningTime="2025-10-06 15:12:07.833770995 +0000 UTC m=+1124.989063507" Oct 06 15:12:08 crc kubenswrapper[4763]: E1006 15:12:08.247068 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 06 15:12:08 crc kubenswrapper[4763]: E1006 15:12:08.247986 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgh7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(9dd38d95-a93a-489c-8e6b-e7a1a597ad13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.315710 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.321037 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data\") pod \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.321138 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6jxk\" (UniqueName: \"kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk\") pod \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.321186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle\") pod \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\" (UID: \"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.323261 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.327709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk" (OuterVolumeSpecName: "kube-api-access-t6jxk") pod "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" (UID: "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb"). InnerVolumeSpecName "kube-api-access-t6jxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.337522 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" (UID: "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.364707 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" (UID: "29c92ac5-96c4-45f2-9fbf-3c43dd548dbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle\") pod \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422247 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7w4l\" (UniqueName: \"kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l\") pod \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config\") pod \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\" (UID: \"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb\") " Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422514 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6jxk\" (UniqueName: \"kubernetes.io/projected/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-kube-api-access-t6jxk\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422531 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.422540 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.425374 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l" (OuterVolumeSpecName: "kube-api-access-l7w4l") pod "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" (UID: "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb"). InnerVolumeSpecName "kube-api-access-l7w4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.445907 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" (UID: "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.450444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config" (OuterVolumeSpecName: "config") pod "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" (UID: "48ec9a93-3d8c-49a0-8dcd-bb10f1270deb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.523471 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7w4l\" (UniqueName: \"kubernetes.io/projected/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-kube-api-access-l7w4l\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.523505 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.523517 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.814609 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7v6n" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.814606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7v6n" event={"ID":"48ec9a93-3d8c-49a0-8dcd-bb10f1270deb","Type":"ContainerDied","Data":"ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0"} Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.814769 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7189eed299a84775c69b3750ec9675bc7ee9ab37becf8d1dd31895c50547d0" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.817736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wg4dh" event={"ID":"29c92ac5-96c4-45f2-9fbf-3c43dd548dbb","Type":"ContainerDied","Data":"40efa4efe3ad24f9fd31ed2045edf51060a69ef740eefe9384898d4251830900"} Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.817778 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wg4dh" Oct 06 15:12:08 crc kubenswrapper[4763]: I1006 15:12:08.817792 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40efa4efe3ad24f9fd31ed2045edf51060a69ef740eefe9384898d4251830900" Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.321788 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.321953 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mnbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vs2bl_openstack(eb17b8c1-c76b-4802-aac9-daaacea9e726): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.323240 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vs2bl" 
podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.573913 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-lmxx7"] Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.574773 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" containerName="barbican-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.574802 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" containerName="barbican-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.574814 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" containerName="neutron-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.574821 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" containerName="neutron-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.575072 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" containerName="neutron-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.575105 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" containerName="barbican-db-sync" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.576211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.589806 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.591023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.598352 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.600544 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.600831 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.601026 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktsj6" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.613645 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-lmxx7"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.664075 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.665487 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.670046 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.675809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7hrr\" (UniqueName: \"kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qt5\" (UniqueName: \"kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw625\" (UniqueName: \"kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747918 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.747988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748053 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748079 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748146 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748274 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.748310 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " 
pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.784662 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-lmxx7"] Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.785433 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-m4qt5 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" podUID="8c9526b4-61b0-4a91-b200-939a3639c4b3" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.803754 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.805198 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.821319 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.831798 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.840153 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.840745 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.844975 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.845389 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.845485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerStarted","Data":"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1"} Oct 06 15:12:09 crc kubenswrapper[4763]: E1006 15:12:09.849439 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vs2bl" podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw625\" (UniqueName: \"kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850749 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850790 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850854 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " 
pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850918 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.850982 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7hrr\" (UniqueName: \"kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.851005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qt5\" (UniqueName: \"kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.851041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.851061 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: 
\"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.854210 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.856798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.857040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.857806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.859177 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.859226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.862681 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.863751 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.865303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.865472 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.865638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.865677 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-neutron-ovndbs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.867843 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-crhfx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.870168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.871006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.873274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.873330 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.874907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.883784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.886200 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"] Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.898305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw625\" (UniqueName: \"kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625\") pod \"barbican-keystone-listener-5699bffc6b-r4hxp\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.905346 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qt5\" (UniqueName: \"kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5\") pod \"dnsmasq-dns-84b966f6c9-lmxx7\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.909557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7hrr\" (UniqueName: \"kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr\") pod \"barbican-worker-697995ff8c-7vbhx\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.929664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.952860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.952930 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.952947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlbn\" (UniqueName: \"kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.953795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.953873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.953911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.953979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954109 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7qc\" (UniqueName: \"kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954176 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-79c97876dd-6hbjr" podStartSLOduration=16.95415928 podStartE2EDuration="16.95415928s" podCreationTimestamp="2025-10-06 15:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:09.907974327 +0000 UTC m=+1127.063266859" watchObservedRunningTime="2025-10-06 15:12:09.95415928 +0000 UTC m=+1127.109451792" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954224 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.954993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.955104 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.955168 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.955312 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfqn8\" (UniqueName: 
\"kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:09 crc kubenswrapper[4763]: I1006 15:12:09.955343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.005203 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.018362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlbn\" (UniqueName: \"kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058390 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058488 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7qc\" (UniqueName: \"kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058556 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfqn8\" (UniqueName: \"kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058666 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.058704 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: 
\"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.059485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.066176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.066457 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.067329 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.067368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.069393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.069793 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.089269 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.097261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.098468 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.121255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.121802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.122991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlbn\" (UniqueName: \"kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn\") pod \"barbican-api-66c6546c98-bp5zs\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.123784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7qc\" (UniqueName: \"kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.124460 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfqn8\" (UniqueName: \"kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8\") pod \"dnsmasq-dns-75c8ddd69c-4d6jz\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.127212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle\") pod \"neutron-6b97fbd7bd-9qwfz\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") " pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qt5\" (UniqueName: \"kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5\") pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160294 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc\") pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config\") 
pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb\") pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160424 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb\") pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.160452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0\") pod \"8c9526b4-61b0-4a91-b200-939a3639c4b3\" (UID: \"8c9526b4-61b0-4a91-b200-939a3639c4b3\") " Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.162416 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.162733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.162758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.162912 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config" (OuterVolumeSpecName: "config") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.163089 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.182804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5" (OuterVolumeSpecName: "kube-api-access-m4qt5") pod "8c9526b4-61b0-4a91-b200-939a3639c4b3" (UID: "8c9526b4-61b0-4a91-b200-939a3639c4b3"). InnerVolumeSpecName "kube-api-access-m4qt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262589 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262845 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262854 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262863 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262872 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qt5\" (UniqueName: \"kubernetes.io/projected/8c9526b4-61b0-4a91-b200-939a3639c4b3-kube-api-access-m4qt5\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.262882 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9526b4-61b0-4a91-b200-939a3639c4b3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.320044 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.424068 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.424133 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.616400 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"] Oct 06 15:12:10 crc kubenswrapper[4763]: W1006 15:12:10.632043 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be9cfb2_36bd_45a7_8d15_1603cb76780a.slice/crio-44c382bd4e7106e7b4e9eb034e5eb7c1f99b5a39005e626cd536a3da60ded40b WatchSource:0}: Error finding container 44c382bd4e7106e7b4e9eb034e5eb7c1f99b5a39005e626cd536a3da60ded40b: Status 404 returned error can't find the container with id 44c382bd4e7106e7b4e9eb034e5eb7c1f99b5a39005e626cd536a3da60ded40b Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.679947 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"] Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.820586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:10 crc kubenswrapper[4763]: W1006 15:12:10.822364 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f1458c_e1f3_438a_bdfc_1801ae47a68d.slice/crio-349baaf0f0538db7216713ead5f602e0698ea2e99b75a503a2ddcd8e6210537e WatchSource:0}: Error finding container 349baaf0f0538db7216713ead5f602e0698ea2e99b75a503a2ddcd8e6210537e: Status 404 returned error can't find the container with id 349baaf0f0538db7216713ead5f602e0698ea2e99b75a503a2ddcd8e6210537e Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.855883 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerStarted","Data":"1925d66496eee672542fba5ca3de281b8471886dd5929a1a34819c22f33db164"} Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.855945 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"] Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.858061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerStarted","Data":"44c382bd4e7106e7b4e9eb034e5eb7c1f99b5a39005e626cd536a3da60ded40b"} Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.860397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerStarted","Data":"349baaf0f0538db7216713ead5f602e0698ea2e99b75a503a2ddcd8e6210537e"} Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.860472 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-lmxx7" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.861035 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.861089 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.914001 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-lmxx7"] Oct 06 15:12:10 crc kubenswrapper[4763]: I1006 15:12:10.937834 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-lmxx7"] Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.070593 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"] Oct 06 15:12:11 crc kubenswrapper[4763]: W1006 15:12:11.087752 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod611dea12_329a_4168_9647_a5fa92453712.slice/crio-63d31e5c124b620ce299e19c9a00872b9c462ea03ac19bfd1fead87fae1ea538 WatchSource:0}: Error finding container 63d31e5c124b620ce299e19c9a00872b9c462ea03ac19bfd1fead87fae1ea538: Status 404 returned error can't find the container with id 63d31e5c124b620ce299e19c9a00872b9c462ea03ac19bfd1fead87fae1ea538 Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.590340 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9526b4-61b0-4a91-b200-939a3639c4b3" path="/var/lib/kubelet/pods/8c9526b4-61b0-4a91-b200-939a3639c4b3/volumes" Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.871928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerStarted","Data":"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.871968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerStarted","Data":"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.872071 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.873736 4763 generic.go:334] "Generic (PLEG): container finished" podID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerID="d745b1a7526c0318fc2e000726c3ede4adb56bf47e3b8fa345950713ab04e7ea" exitCode=0 Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.873775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" event={"ID":"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb","Type":"ContainerDied","Data":"d745b1a7526c0318fc2e000726c3ede4adb56bf47e3b8fa345950713ab04e7ea"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.873808 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" event={"ID":"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb","Type":"ContainerStarted","Data":"c3c6cc4df4b7ee80612fc88fcc663ae8b705cbc3733755ceda1edc46d708b0f9"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.879652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" 
event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerStarted","Data":"0f3abaa43abd980ead73f222d61067f438dce0f473043b9f6581afd21b553dc8"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.879689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerStarted","Data":"178342601b11f43b80ee57ac3844284165aca64d56c0cdd232573ba7de42e82b"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.879705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerStarted","Data":"63d31e5c124b620ce299e19c9a00872b9c462ea03ac19bfd1fead87fae1ea538"} Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.964176 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66c6546c98-bp5zs" podStartSLOduration=2.964151923 podStartE2EDuration="2.964151923s" podCreationTimestamp="2025-10-06 15:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:11.901583509 +0000 UTC m=+1129.056876021" watchObservedRunningTime="2025-10-06 15:12:11.964151923 +0000 UTC m=+1129.119444435" Oct 06 15:12:11 crc kubenswrapper[4763]: I1006 15:12:11.964943 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b97fbd7bd-9qwfz" podStartSLOduration=2.964934075 podStartE2EDuration="2.964934075s" podCreationTimestamp="2025-10-06 15:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:11.959131935 +0000 UTC m=+1129.114424447" watchObservedRunningTime="2025-10-06 15:12:11.964934075 +0000 UTC m=+1129.120226587" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.024373 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.668034 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.676945 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.679471 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.681407 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.683761 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.822925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823335 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.823560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gqw\" (UniqueName: \"kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.889108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" 
event={"ID":"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb","Type":"ContainerStarted","Data":"6e9553bed3c4aaedc82dfaaa2230a429a9f286abaa71c5cbe7ac273500f6f912"} Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.889650 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.889810 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.913641 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" podStartSLOduration=3.91362429 podStartE2EDuration="3.91362429s" podCreationTimestamp="2025-10-06 15:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:12.912269492 +0000 UTC m=+1130.067562024" watchObservedRunningTime="2025-10-06 15:12:12.91362429 +0000 UTC m=+1130.068916802" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925526 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925635 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.925675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gqw\" (UniqueName: 
\"kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.930501 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.930685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.931524 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.931536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.933413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.939895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:12 crc kubenswrapper[4763]: I1006 15:12:12.941231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gqw\" (UniqueName: \"kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw\") pod \"neutron-5b667cdf65-js4pw\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") " pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:13 crc kubenswrapper[4763]: I1006 15:12:13.005860 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:13 crc kubenswrapper[4763]: I1006 15:12:13.901052 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:15 crc kubenswrapper[4763]: I1006 15:12:15.994435 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"] Oct 06 15:12:15 crc kubenswrapper[4763]: I1006 15:12:15.997015 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.000145 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.007932 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.018905 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"] Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.082475 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qlpm\" (UniqueName: \"kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.082542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.082569 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.082810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.082980 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.083049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.083083 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184713 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184756 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qlpm\" (UniqueName: \"kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.184988 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.185649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.192118 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.192230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.192469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.192554 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.192890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.202236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qlpm\" (UniqueName: \"kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm\") pod \"barbican-api-85496b9568-j6pjn\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") " pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:16 crc kubenswrapper[4763]: I1006 15:12:16.320465 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:18 crc kubenswrapper[4763]: I1006 15:12:18.722943 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:12:18 crc kubenswrapper[4763]: W1006 15:12:18.746831 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93a939be_54f9_4483_b37c_57e6d5b04f0d.slice/crio-6b9d2fcdd5150f0c45d912573a585533611f5c6dd30221d1e73989b2a7ac1ff4 WatchSource:0}: Error finding container 6b9d2fcdd5150f0c45d912573a585533611f5c6dd30221d1e73989b2a7ac1ff4: Status 404 returned error can't find the container with id 6b9d2fcdd5150f0c45d912573a585533611f5c6dd30221d1e73989b2a7ac1ff4 Oct 06 15:12:18 crc kubenswrapper[4763]: I1006 15:12:18.948420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerStarted","Data":"6b9d2fcdd5150f0c45d912573a585533611f5c6dd30221d1e73989b2a7ac1ff4"} Oct 06 15:12:19 crc kubenswrapper[4763]: W1006 15:12:19.162871 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7909e384_b1c8_476c_801d_8b60015ccdc4.slice/crio-cf63db15a7b55bfdddd784dd5ee14bcf822b33d86707ad079b125332466032d7 WatchSource:0}: Error finding container cf63db15a7b55bfdddd784dd5ee14bcf822b33d86707ad079b125332466032d7: Status 404 returned error can't find the container with id cf63db15a7b55bfdddd784dd5ee14bcf822b33d86707ad079b125332466032d7 Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.164908 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"] Oct 06 15:12:19 crc kubenswrapper[4763]: E1006 15:12:19.262467 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.958579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerStarted","Data":"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.958671 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerStarted","Data":"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.960820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerStarted","Data":"0ff5b5a9d760f2d9ecf1a7c7038e9965e60f8bcb7447f920abe0567b0a54badf"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.960868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerStarted","Data":"e605b0aaa421d3c856e90bbcb0d9a8126bc1d053474a70e3d68b5174771747d5"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.962073 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.963310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerStarted","Data":"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.963363 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerStarted","Data":"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.963378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerStarted","Data":"cf63db15a7b55bfdddd784dd5ee14bcf822b33d86707ad079b125332466032d7"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.963789 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.963958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.966644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerStarted","Data":"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.966777 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-central-agent" containerID="cri-o://0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21" gracePeriod=30 Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.966878 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="proxy-httpd" containerID="cri-o://a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4" gracePeriod=30 Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.966795 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.967033 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-notification-agent" containerID="cri-o://56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3" gracePeriod=30 Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.969213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerStarted","Data":"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2"} Oct 06 15:12:19 crc kubenswrapper[4763]: I1006 15:12:19.969273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerStarted","Data":"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52"} Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 
15:12:20.012541 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" podStartSLOduration=3.543591591 podStartE2EDuration="11.012521866s" podCreationTimestamp="2025-10-06 15:12:09 +0000 UTC" firstStartedPulling="2025-10-06 15:12:10.711318966 +0000 UTC m=+1127.866611468" lastFinishedPulling="2025-10-06 15:12:18.180249231 +0000 UTC m=+1135.335541743" observedRunningTime="2025-10-06 15:12:19.987648651 +0000 UTC m=+1137.142941173" watchObservedRunningTime="2025-10-06 15:12:20.012521866 +0000 UTC m=+1137.167814378" Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.016906 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b667cdf65-js4pw" podStartSLOduration=8.016890847 podStartE2EDuration="8.016890847s" podCreationTimestamp="2025-10-06 15:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:20.007024065 +0000 UTC m=+1137.162316597" watchObservedRunningTime="2025-10-06 15:12:20.016890847 +0000 UTC m=+1137.172183359" Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.048583 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85496b9568-j6pjn" podStartSLOduration=5.04856843 podStartE2EDuration="5.04856843s" podCreationTimestamp="2025-10-06 15:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:20.045004732 +0000 UTC m=+1137.200297244" watchObservedRunningTime="2025-10-06 15:12:20.04856843 +0000 UTC m=+1137.203860942" Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.064810 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-697995ff8c-7vbhx" podStartSLOduration=2.955463504 podStartE2EDuration="11.064794307s" podCreationTimestamp="2025-10-06 15:12:09 +0000 UTC" firstStartedPulling="2025-10-06 15:12:10.636157205 +0000 UTC m=+1127.791449717" lastFinishedPulling="2025-10-06 15:12:18.745488008 +0000 UTC m=+1135.900780520" observedRunningTime="2025-10-06 15:12:20.063982955 +0000 UTC m=+1137.219275477" watchObservedRunningTime="2025-10-06 15:12:20.064794307 +0000 UTC m=+1137.220086819" Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.425597 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.481733 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.482021 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerName="dnsmasq-dns" containerID="cri-o://bed4a5f7dee4e8d86257c6804ba8b89ef1c3fa69210bede580ab29b417466c14" gracePeriod=10 Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.993760 4763 generic.go:334] "Generic (PLEG): container finished" podID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerID="bed4a5f7dee4e8d86257c6804ba8b89ef1c3fa69210bede580ab29b417466c14" exitCode=0 Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.993906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" 
event={"ID":"84f18dfb-e90b-4de9-bdfc-93d8a570f23a","Type":"ContainerDied","Data":"bed4a5f7dee4e8d86257c6804ba8b89ef1c3fa69210bede580ab29b417466c14"} Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.995877 4763 generic.go:334] "Generic (PLEG): container finished" podID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerID="a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4" exitCode=0 Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.995889 4763 generic.go:334] "Generic (PLEG): container finished" podID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerID="0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21" exitCode=0 Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.996716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerDied","Data":"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4"} Oct 06 15:12:20 crc kubenswrapper[4763]: I1006 15:12:20.996737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerDied","Data":"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21"} Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.123326 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.269904 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.269968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.270004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.270031 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.270116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.270197 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25l8w\" (UniqueName: \"kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: 
\"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.278062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w" (OuterVolumeSpecName: "kube-api-access-25l8w") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "kube-api-access-25l8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.331258 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.332154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.338097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.338170 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: E1006 15:12:21.338198 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config podName:84f18dfb-e90b-4de9-bdfc-93d8a570f23a nodeName:}" failed. No retries permitted until 2025-10-06 15:12:21.838167969 +0000 UTC m=+1138.993460491 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a") : error deleting /var/lib/kubelet/pods/84f18dfb-e90b-4de9-bdfc-93d8a570f23a/volume-subpaths: remove /var/lib/kubelet/pods/84f18dfb-e90b-4de9-bdfc-93d8a570f23a/volume-subpaths: no such file or directory Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.372883 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25l8w\" (UniqueName: \"kubernetes.io/projected/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-kube-api-access-25l8w\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.372924 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.372934 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.372952 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.372963 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.857460 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.876408 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.881845 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") pod \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\" (UID: \"84f18dfb-e90b-4de9-bdfc-93d8a570f23a\") " Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.882812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config" (OuterVolumeSpecName: "config") pod "84f18dfb-e90b-4de9-bdfc-93d8a570f23a" (UID: "84f18dfb-e90b-4de9-bdfc-93d8a570f23a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:21 crc kubenswrapper[4763]: I1006 15:12:21.983872 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f18dfb-e90b-4de9-bdfc-93d8a570f23a-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.006906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" event={"ID":"84f18dfb-e90b-4de9-bdfc-93d8a570f23a","Type":"ContainerDied","Data":"533d492f2a5841d54bce55e4b46bc5976ea884c66dab3bf1bd36b936aec769c9"} Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.006974 4763 scope.go:117] "RemoveContainer" containerID="bed4a5f7dee4e8d86257c6804ba8b89ef1c3fa69210bede580ab29b417466c14" Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.007186 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dbwqg" Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.038593 4763 scope.go:117] "RemoveContainer" containerID="638eff170fd262a59d2315d59828df4187b161759a932ecbb6e6672aa691dc1f" Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.059242 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.070369 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dbwqg"] Oct 06 15:12:22 crc kubenswrapper[4763]: I1006 15:12:22.904506 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008312 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008386 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgh7b\" (UniqueName: \"kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008506 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008541 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.008592 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd\") pod \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\" (UID: \"9dd38d95-a93a-489c-8e6b-e7a1a597ad13\") " Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.009222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.009233 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.013315 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.013400 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts" (OuterVolumeSpecName: "scripts") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.014975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b" (OuterVolumeSpecName: "kube-api-access-mgh7b") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "kube-api-access-mgh7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.027285 4763 generic.go:334] "Generic (PLEG): container finished" podID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerID="56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3" exitCode=0 Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.027347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerDied","Data":"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3"} Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.027383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9dd38d95-a93a-489c-8e6b-e7a1a597ad13","Type":"ContainerDied","Data":"825b8fbb51dc12bddc29920049a1f9e0ff685e4e8b9bc08cc8d0f864ca4780c8"} Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.027422 4763 scope.go:117] "RemoveContainer" containerID="a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.027650 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.053827 4763 scope.go:117] "RemoveContainer" containerID="56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.071152 4763 scope.go:117] "RemoveContainer" containerID="0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.089954 4763 scope.go:117] "RemoveContainer" containerID="a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.093049 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4\": container with ID starting with a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4 not found: ID does not exist" containerID="a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.093088 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4"} err="failed to get container status \"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4\": rpc error: code = NotFound desc = could not find container \"a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4\": container with ID starting with a81dc090696d3f6ff80e815ecee3f29f816334e3a05747c6415c77e20ac6ceb4 not found: ID does not exist" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.093110 4763 scope.go:117] "RemoveContainer" containerID="56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.093531 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3\": container with ID starting with 56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3 not found: ID does not exist" containerID="56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 
15:12:23.093575 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3"} err="failed to get container status \"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3\": rpc error: code = NotFound desc = could not find container \"56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3\": container with ID starting with 56dcaf62383f4792e6831b2dd9d1db849fd2eca01f7d98fdf704ad88ace8f4c3 not found: ID does not exist" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.093609 4763 scope.go:117] "RemoveContainer" containerID="0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.093935 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21\": container with ID starting with 0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21 not found: ID does not exist" containerID="0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.093959 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21"} err="failed to get container status \"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21\": rpc error: code = NotFound desc = could not find container \"0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21\": container with ID starting with 0ea69b10ea5e900928ca775ec06abc0ec17fe940e7a906d316d27170e51d2c21 not found: ID does not exist" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.104424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110786 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110873 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110888 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110898 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgh7b\" (UniqueName: \"kubernetes.io/projected/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-kube-api-access-mgh7b\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110910 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.110921 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.129038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data" (OuterVolumeSpecName: "config-data") pod "9dd38d95-a93a-489c-8e6b-e7a1a597ad13" (UID: "9dd38d95-a93a-489c-8e6b-e7a1a597ad13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.212847 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd38d95-a93a-489c-8e6b-e7a1a597ad13-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.401689 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.401739 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423144 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.423558 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-central-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423579 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-central-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.423628 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerName="init" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423638 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerName="init" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.423657 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerName="dnsmasq-dns" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423666 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" containerName="dnsmasq-dns" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.423694 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="proxy-httpd" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423703 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="proxy-httpd" Oct 06 15:12:23 crc kubenswrapper[4763]: E1006 15:12:23.423720 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-notification-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423754 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-notification-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423948 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="proxy-httpd" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423970 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-central-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.423992 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" containerName="ceilometer-notification-agent" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.424011 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" 
containerName="dnsmasq-dns" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.426941 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.429659 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.429973 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.447318 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmts9\" (UniqueName: \"kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.523573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.589163 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f18dfb-e90b-4de9-bdfc-93d8a570f23a" path="/var/lib/kubelet/pods/84f18dfb-e90b-4de9-bdfc-93d8a570f23a/volumes" Oct 06 15:12:23 crc 
kubenswrapper[4763]: I1006 15:12:23.590720 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd38d95-a93a-489c-8e6b-e7a1a597ad13" path="/var/lib/kubelet/pods/9dd38d95-a93a-489c-8e6b-e7a1a597ad13/volumes" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625297 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmts9\" (UniqueName: \"kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.625857 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.627659 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 
15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.627713 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.633708 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.639461 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.640763 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.641091 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.651514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmts9\" (UniqueName: \"kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9\") pod \"ceilometer-0\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " pod="openstack/ceilometer-0" Oct 06 15:12:23 crc kubenswrapper[4763]: I1006 15:12:23.763460 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:24 crc kubenswrapper[4763]: I1006 15:12:24.057099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs2bl" event={"ID":"eb17b8c1-c76b-4802-aac9-daaacea9e726","Type":"ContainerStarted","Data":"8bcc49ab811eb212bd4375f7cdbde3ddbb5e4e78565e5c34a1a7eda5d16ff196"} Oct 06 15:12:24 crc kubenswrapper[4763]: I1006 15:12:24.062997 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:12:24 crc kubenswrapper[4763]: I1006 15:12:24.081847 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vs2bl" podStartSLOduration=3.2518752490000002 podStartE2EDuration="43.08182933s" podCreationTimestamp="2025-10-06 15:11:41 +0000 UTC" firstStartedPulling="2025-10-06 15:11:42.723931292 +0000 UTC m=+1099.879223804" lastFinishedPulling="2025-10-06 15:12:22.553885373 +0000 UTC m=+1139.709177885" observedRunningTime="2025-10-06 15:12:24.079887157 +0000 UTC m=+1141.235179669" watchObservedRunningTime="2025-10-06 15:12:24.08182933 +0000 UTC m=+1141.237121842" Oct 06 15:12:24 crc kubenswrapper[4763]: I1006 15:12:24.263405 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:25 crc kubenswrapper[4763]: I1006 15:12:25.081788 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerStarted","Data":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} Oct 06 15:12:25 crc kubenswrapper[4763]: I1006 15:12:25.082822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerStarted","Data":"00f599a06447fcb70ae4a790bb85a06b2cf4fa4dbc3d86106bb61db98be1d6a5"} Oct 06 15:12:27 crc kubenswrapper[4763]: I1006 15:12:27.100033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerStarted","Data":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} Oct 06 15:12:27 crc kubenswrapper[4763]: I1006 15:12:27.641364 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:27 crc kubenswrapper[4763]: I1006 15:12:27.731297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:12:27 crc kubenswrapper[4763]: I1006 15:12:27.972442 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85496b9568-j6pjn" Oct 06 15:12:28 crc kubenswrapper[4763]: I1006 15:12:28.037544 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:28 crc kubenswrapper[4763]: I1006 15:12:28.037794 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66c6546c98-bp5zs" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api-log" containerID="cri-o://b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095" gracePeriod=30 Oct 06 15:12:28 crc kubenswrapper[4763]: I1006 15:12:28.037943 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66c6546c98-bp5zs" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api" 
containerID="cri-o://c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3" gracePeriod=30 Oct 06 15:12:28 crc kubenswrapper[4763]: I1006 15:12:28.110138 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerStarted","Data":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.121403 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb17b8c1-c76b-4802-aac9-daaacea9e726" containerID="8bcc49ab811eb212bd4375f7cdbde3ddbb5e4e78565e5c34a1a7eda5d16ff196" exitCode=0 Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.121591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs2bl" event={"ID":"eb17b8c1-c76b-4802-aac9-daaacea9e726","Type":"ContainerDied","Data":"8bcc49ab811eb212bd4375f7cdbde3ddbb5e4e78565e5c34a1a7eda5d16ff196"} Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.125111 4763 generic.go:334] "Generic (PLEG): container finished" podID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerID="b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095" exitCode=143 Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.125153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerDied","Data":"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095"} Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.139667 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.142526 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.148839 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-g7jfp" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.149417 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.150800 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.167242 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.225884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9mc9\" (UniqueName: \"kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.225994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.226034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.226090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.329945 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.330032 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.330103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.330143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c9mc9\" (UniqueName: \"kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.331866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.339351 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.349806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.360542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9mc9\" (UniqueName: \"kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9\") pod \"openstackclient\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") " pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.507544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 15:12:29 crc kubenswrapper[4763]: I1006 15:12:29.971216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.133159 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f62b41ca-5e6d-4760-be62-af924b841737","Type":"ContainerStarted","Data":"82d100606595952a381bb5b067bec6dac1882d874c1f7890b41cb85a715fe7c9"} Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.136933 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerStarted","Data":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.136973 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.158686 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.611307971 podStartE2EDuration="7.158666571s" podCreationTimestamp="2025-10-06 15:12:23 +0000 UTC" firstStartedPulling="2025-10-06 15:12:24.307204701 +0000 UTC m=+1141.462497213" lastFinishedPulling="2025-10-06 15:12:28.854563301 +0000 UTC m=+1146.009855813" observedRunningTime="2025-10-06 15:12:30.155102002 +0000 UTC m=+1147.310394514" watchObservedRunningTime="2025-10-06 15:12:30.158666571 +0000 UTC m=+1147.313959093" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.520974 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558492 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mnbl\" (UniqueName: \"kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558780 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.558796 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle\") pod \"eb17b8c1-c76b-4802-aac9-daaacea9e726\" (UID: \"eb17b8c1-c76b-4802-aac9-daaacea9e726\") " Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.559288 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb17b8c1-c76b-4802-aac9-daaacea9e726-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.567596 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.568527 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts" (OuterVolumeSpecName: "scripts") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.577279 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl" (OuterVolumeSpecName: "kube-api-access-9mnbl") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "kube-api-access-9mnbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.601775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.617876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data" (OuterVolumeSpecName: "config-data") pod "eb17b8c1-c76b-4802-aac9-daaacea9e726" (UID: "eb17b8c1-c76b-4802-aac9-daaacea9e726"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.661454 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.661487 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.661499 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mnbl\" (UniqueName: \"kubernetes.io/projected/eb17b8c1-c76b-4802-aac9-daaacea9e726-kube-api-access-9mnbl\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.661580 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:30 crc kubenswrapper[4763]: I1006 15:12:30.661679 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb17b8c1-c76b-4802-aac9-daaacea9e726-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.145024 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vs2bl" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.145018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs2bl" event={"ID":"eb17b8c1-c76b-4802-aac9-daaacea9e726","Type":"ContainerDied","Data":"89efe739a070eb3a75ef7cb05c7213101ab8250e11ce9c7c361bc5f936a78e16"} Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.145414 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89efe739a070eb3a75ef7cb05c7213101ab8250e11ce9c7c361bc5f936a78e16" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.224698 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66c6546c98-bp5zs" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:57272->10.217.0.155:9311: read: connection reset by peer" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.224747 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66c6546c98-bp5zs" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:57262->10.217.0.155:9311: read: connection reset by peer" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.433807 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:31 crc kubenswrapper[4763]: E1006 15:12:31.434225 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" containerName="cinder-db-sync" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.434247 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" containerName="cinder-db-sync" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.434476 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" containerName="cinder-db-sync" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.435572 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.439184 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.439438 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hvbxm" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.439543 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.439796 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.451096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.479527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.479915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.480042 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.480123 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.480224 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l9d\" (UniqueName: \"kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.480290 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.515737 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"] Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.522579 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.531029 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"] Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.582647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.582849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.584073 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.584146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.584200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.584261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7l9d\" (UniqueName: \"kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.584314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.621661 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7l9d\" (UniqueName: \"kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.622502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc 
kubenswrapper[4763]: I1006 15:12:31.632080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.632715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.658849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.680699 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.683382 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.689786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.691796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.691863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.691899 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692083 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692128 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pqn\" (UniqueName: \"kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692324 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dcs\" (UniqueName: \"kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.692412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.698224 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.743543 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.759892 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794109 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data\") pod \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom\") pod \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle\") pod \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794318 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twlbn\" (UniqueName: \"kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn\") pod \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794342 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs\") pod \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\" (UID: \"f0f1458c-e1f3-438a-bdfc-1801ae47a68d\") " Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794680 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dcs\" (UniqueName: \"kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794730 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794832 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pqn\" (UniqueName: \"kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.794932 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.795455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.795874 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.797784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.798757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs" (OuterVolumeSpecName: "logs") pod "f0f1458c-e1f3-438a-bdfc-1801ae47a68d" (UID: "f0f1458c-e1f3-438a-bdfc-1801ae47a68d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.799273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.799349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.801318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.801345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.808922 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0f1458c-e1f3-438a-bdfc-1801ae47a68d" (UID: "f0f1458c-e1f3-438a-bdfc-1801ae47a68d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.818039 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn" (OuterVolumeSpecName: "kube-api-access-twlbn") pod "f0f1458c-e1f3-438a-bdfc-1801ae47a68d" (UID: "f0f1458c-e1f3-438a-bdfc-1801ae47a68d"). InnerVolumeSpecName "kube-api-access-twlbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.818841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.819355 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.821607 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pqn\" (UniqueName: \"kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn\") pod \"dnsmasq-dns-5784cf869f-nft57\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") " pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.823280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.825220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.840804 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dcs\" (UniqueName: \"kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs\") pod \"cinder-api-0\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " pod="openstack/cinder-api-0" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.847074 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f1458c-e1f3-438a-bdfc-1801ae47a68d" (UID: "f0f1458c-e1f3-438a-bdfc-1801ae47a68d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.891531 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.903377 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.903407 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.903416 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twlbn\" (UniqueName: \"kubernetes.io/projected/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-kube-api-access-twlbn\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.903428 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:31 crc kubenswrapper[4763]: I1006 15:12:31.927237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data" (OuterVolumeSpecName: "config-data") pod "f0f1458c-e1f3-438a-bdfc-1801ae47a68d" (UID: "f0f1458c-e1f3-438a-bdfc-1801ae47a68d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.004759 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1458c-e1f3-438a-bdfc-1801ae47a68d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.049373 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.169219 4763 generic.go:334] "Generic (PLEG): container finished" podID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerID="c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3" exitCode=0 Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.169510 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66c6546c98-bp5zs" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.171608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerDied","Data":"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3"} Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.171663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c6546c98-bp5zs" event={"ID":"f0f1458c-e1f3-438a-bdfc-1801ae47a68d","Type":"ContainerDied","Data":"349baaf0f0538db7216713ead5f602e0698ea2e99b75a503a2ddcd8e6210537e"} Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.171688 4763 scope.go:117] "RemoveContainer" containerID="c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.249453 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.259273 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-66c6546c98-bp5zs"] Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.266602 4763 scope.go:117] "RemoveContainer" containerID="b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.281990 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.317100 4763 scope.go:117] "RemoveContainer" containerID="c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3" Oct 06 15:12:32 crc kubenswrapper[4763]: E1006 15:12:32.317703 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3\": container with ID starting with c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3 not found: ID does not exist" containerID="c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.317746 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3"} err="failed to get container status \"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3\": rpc error: code = NotFound desc = could not find container \"c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3\": container with ID starting with c3fe2f8fd295cf9115fc5d9e10f60c891d77c929025f2f98f2e18e6c41fac9d3 not found: ID does not exist" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.317774 4763 scope.go:117] "RemoveContainer" containerID="b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095" Oct 06 15:12:32 crc kubenswrapper[4763]: E1006 15:12:32.318112 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095\": container with ID starting with b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095 not found: ID does not exist" containerID="b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.318161 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095"} err="failed to get container status \"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095\": rpc error: code = NotFound desc = could not find container \"b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095\": container with ID starting with b15a950b0385d247f4bdaf387985e86c6cb011d24880b31f9fed62f9da233095 not found: ID does not exist" Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.492114 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"] Oct 06 15:12:32 crc kubenswrapper[4763]: W1006 15:12:32.650393 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f5ba72_370b_4acb_b571_5714da3c5493.slice/crio-a040b9e3251d31ffdde6200b3175e6f214e07b93f83d9ed37a1172d9e37ec1dd WatchSource:0}: Error finding container a040b9e3251d31ffdde6200b3175e6f214e07b93f83d9ed37a1172d9e37ec1dd: Status 404 returned error can't find the container with id a040b9e3251d31ffdde6200b3175e6f214e07b93f83d9ed37a1172d9e37ec1dd Oct 06 15:12:32 crc kubenswrapper[4763]: I1006 15:12:32.651366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.142889 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.202228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerStarted","Data":"8596860edd134df877602b1da440e2bc843a3717c9ff30d9c2f0ff768a3d38a0"} Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.203920 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerStarted","Data":"a040b9e3251d31ffdde6200b3175e6f214e07b93f83d9ed37a1172d9e37ec1dd"} Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208237 4763 generic.go:334] "Generic (PLEG): container finished" podID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerID="791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80" exitCode=0 Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208354 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-nft57" event={"ID":"85ec42c3-f313-4218-b323-55d9e5d0a78c","Type":"ContainerDied","Data":"791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80"} Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208384 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-nft57" event={"ID":"85ec42c3-f313-4218-b323-55d9e5d0a78c","Type":"ContainerStarted","Data":"7f9b585cdc682976821929e9f5c8ffc13c10faaf582163293868b6711960e4c5"} Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208481 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-central-agent" containerID="cri-o://91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" gracePeriod=30 Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208585 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="proxy-httpd" 
containerID="cri-o://1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" gracePeriod=30 Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208655 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="sg-core" containerID="cri-o://478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" gracePeriod=30 Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.208691 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-notification-agent" containerID="cri-o://923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" gracePeriod=30 Oct 06 15:12:33 crc kubenswrapper[4763]: I1006 15:12:33.588749 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" path="/var/lib/kubelet/pods/f0f1458c-e1f3-438a-bdfc-1801ae47a68d/volumes" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.124779 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.125550 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api-log" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.125568 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api-log" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.125595 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.125605 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.125787 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.125799 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f1458c-e1f3-438a-bdfc-1801ae47a68d" containerName="barbican-api-log" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.127836 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.132706 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.132948 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.133050 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.180546 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.204906 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.239757 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-nft57" event={"ID":"85ec42c3-f313-4218-b323-55d9e5d0a78c","Type":"ContainerStarted","Data":"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.240034 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.245794 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerStarted","Data":"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254098 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed442e9b-5025-42ba-b723-3da614839bde" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" exitCode=0 Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254135 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed442e9b-5025-42ba-b723-3da614839bde" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" exitCode=2 Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254147 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed442e9b-5025-42ba-b723-3da614839bde" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" exitCode=0 Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254158 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed442e9b-5025-42ba-b723-3da614839bde" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" exitCode=0 Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254181 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerDied","Data":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254240 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerDied","Data":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerDied","Data":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerDied","Data":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed442e9b-5025-42ba-b723-3da614839bde","Type":"ContainerDied","Data":"00f599a06447fcb70ae4a790bb85a06b2cf4fa4dbc3d86106bb61db98be1d6a5"} Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.254297 4763 scope.go:117] "RemoveContainer" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc 
kubenswrapper[4763]: I1006 15:12:34.254434 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.259235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.259272 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.259304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.259355 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.260235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.260337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.260456 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.260504 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmknc\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.276399 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5784cf869f-nft57" podStartSLOduration=3.276381969 podStartE2EDuration="3.276381969s" podCreationTimestamp="2025-10-06 15:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:34.261547301 +0000 UTC m=+1151.416839843" watchObservedRunningTime="2025-10-06 15:12:34.276381969 +0000 UTC m=+1151.431674481" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.312850 4763 scope.go:117] "RemoveContainer" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.347199 4763 scope.go:117] "RemoveContainer" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmts9\" (UniqueName: \"kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366493 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366644 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.366677 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd\") pod \"ed442e9b-5025-42ba-b723-3da614839bde\" (UID: \"ed442e9b-5025-42ba-b723-3da614839bde\") " Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: 
\"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367115 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmknc\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.367440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.369606 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.371327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.371880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.375495 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts" (OuterVolumeSpecName: "scripts") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.382208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.395777 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9" (OuterVolumeSpecName: "kube-api-access-jmts9") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "kube-api-access-jmts9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.396483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.396601 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.397397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.398038 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.398964 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.402809 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmknc\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc\") pod \"swift-proxy-5ffc68c745-rgzs7\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.451758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.469873 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.470142 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed442e9b-5025-42ba-b723-3da614839bde-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.470205 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.470264 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmts9\" (UniqueName: \"kubernetes.io/projected/ed442e9b-5025-42ba-b723-3da614839bde-kube-api-access-jmts9\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.470406 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.503349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.513229 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.525957 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data" (OuterVolumeSpecName: "config-data") pod "ed442e9b-5025-42ba-b723-3da614839bde" (UID: "ed442e9b-5025-42ba-b723-3da614839bde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.532773 4763 scope.go:117] "RemoveContainer" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.548812 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.569771 4763 scope.go:117] "RemoveContainer" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.570209 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": container with ID starting with 1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805 not found: ID does not exist" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.570244 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} err="failed to get container status \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": rpc error: code = NotFound desc = could not find container \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": container with ID starting with 1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.570270 4763 scope.go:117] "RemoveContainer" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.570520 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": container with ID starting with 478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7 not found: ID does not exist" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.570544 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} err="failed to get container status \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": rpc error: code = NotFound desc = could not find container \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": container with ID starting with 478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.570559 4763 scope.go:117] "RemoveContainer" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571273 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571299 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed442e9b-5025-42ba-b723-3da614839bde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.571542 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": container with ID starting with 
923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5 not found: ID does not exist" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571571 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} err="failed to get container status \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": rpc error: code = NotFound desc = could not find container \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": container with ID starting with 923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571588 4763 scope.go:117] "RemoveContainer" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.571890 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": container with ID starting with 91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194 not found: ID does not exist" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571910 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} err="failed to get container status \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": rpc error: code = NotFound desc = could not find container \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": container with ID starting with 91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.571928 4763 scope.go:117] "RemoveContainer" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.572274 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} err="failed to get container status \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": rpc error: code = NotFound desc = could not find container \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": container with ID starting with 1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.572300 4763 scope.go:117] "RemoveContainer" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.572888 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} err="failed to get container status \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": rpc error: code = NotFound desc = could not find container \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": container with ID starting with 478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7 not found: ID does not exist" Oct 06 
15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.572951 4763 scope.go:117] "RemoveContainer" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573280 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} err="failed to get container status \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": rpc error: code = NotFound desc = could not find container \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": container with ID starting with 923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573304 4763 scope.go:117] "RemoveContainer" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573541 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} err="failed to get container status \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": rpc error: code = NotFound desc = could not find container \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": container with ID starting with 91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573564 4763 scope.go:117] "RemoveContainer" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573862 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} err="failed to get container status \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": rpc error: code = NotFound desc = could not find container \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": container with ID starting with 1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.573887 4763 scope.go:117] "RemoveContainer" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.574780 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} err="failed to get container status \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": rpc error: code = NotFound desc = could not find container \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": container with ID starting with 478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.574805 4763 scope.go:117] "RemoveContainer" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.581479 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} err="failed to get container status 
\"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": rpc error: code = NotFound desc = could not find container \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": container with ID starting with 923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.582808 4763 scope.go:117] "RemoveContainer" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.604785 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.625848 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} err="failed to get container status \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": rpc error: code = NotFound desc = could not find container \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": container with ID starting with 91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.625910 4763 scope.go:117] "RemoveContainer" containerID="1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.627315 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805"} err="failed to get container status \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": rpc error: code = NotFound desc = could not find container \"1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805\": container with ID starting with 1768e3521ac2428cd422e1eb366f276c5aaf2c236c38e7be438b854316cc6805 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.627358 4763 scope.go:117] "RemoveContainer" containerID="478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.627605 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7"} err="failed to get container status \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": rpc error: code = NotFound desc = could not find container \"478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7\": container with ID starting with 478dcb9e387d67653e3964cd76f4b36a03430b148f21e21172338d19238c71d7 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.627700 4763 scope.go:117] "RemoveContainer" containerID="923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.627970 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5"} err="failed to get container status \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": rpc error: code = NotFound desc = could not find container \"923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5\": container with ID starting with 923b5e7fdd90695ca3c5b7cb142169a2b063deda3030cad9c1126748bc079fd5 not found: ID does not 
exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.628005 4763 scope.go:117] "RemoveContainer" containerID="91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.628162 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194"} err="failed to get container status \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": rpc error: code = NotFound desc = could not find container \"91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194\": container with ID starting with 91c684e0f60156b65e898a235846e7012e93a13fa891285da3ad2dc69bf75194 not found: ID does not exist" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.630660 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.661364 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.661897 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-notification-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.661915 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-notification-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.661928 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="sg-core" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.661935 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="sg-core" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.661957 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-central-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.661966 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-central-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: E1006 15:12:34.661997 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="proxy-httpd" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.662006 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="proxy-httpd" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.662219 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-notification-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.662272 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="ceilometer-central-agent" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.662293 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="sg-core" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.662305 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed442e9b-5025-42ba-b723-3da614839bde" containerName="proxy-httpd" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 
15:12:34.666658 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.666886 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.672031 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.672223 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673785 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grh9\" (UniqueName: \"kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.673952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776481 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " 
pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776806 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.776933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grh9\" (UniqueName: \"kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.780273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.780290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.790114 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.793076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc 
kubenswrapper[4763]: I1006 15:12:34.793463 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.794197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:34 crc kubenswrapper[4763]: I1006 15:12:34.807746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grh9\" (UniqueName: \"kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9\") pod \"ceilometer-0\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " pod="openstack/ceilometer-0" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.009012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.248273 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.313249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerStarted","Data":"b31661be98b8240c26143e8f17e906efe34924a310dad31c6977f96d6065437a"} Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.318313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerStarted","Data":"169b16272c345969b09cd477d7f57f66ee6b66d73d67b1a449ca18b63b071d1e"} Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.331951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerStarted","Data":"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640"} Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.332374 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.340713 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.390262657 podStartE2EDuration="4.34069183s" podCreationTimestamp="2025-10-06 15:12:31 +0000 UTC" firstStartedPulling="2025-10-06 15:12:32.325880696 +0000 UTC m=+1149.481173208" lastFinishedPulling="2025-10-06 15:12:33.276309879 +0000 UTC m=+1150.431602381" observedRunningTime="2025-10-06 15:12:35.336383061 +0000 UTC m=+1152.491675573" watchObservedRunningTime="2025-10-06 15:12:35.34069183 +0000 UTC m=+1152.495984342" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.367510 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.367490178 podStartE2EDuration="4.367490178s" podCreationTimestamp="2025-10-06 15:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:35.359298483 +0000 UTC m=+1152.514590995" 
watchObservedRunningTime="2025-10-06 15:12:35.367490178 +0000 UTC m=+1152.522782690" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.534344 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.597154 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed442e9b-5025-42ba-b723-3da614839bde" path="/var/lib/kubelet/pods/ed442e9b-5025-42ba-b723-3da614839bde/volumes" Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.838947 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.839260 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-log" containerID="cri-o://0890ad246b756974b6b74972304dcf73498cc71646d0dabba0480b45f1736aab" gracePeriod=30 Oct 06 15:12:35 crc kubenswrapper[4763]: I1006 15:12:35.839306 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-httpd" containerID="cri-o://aa5d91c1e7400c99fe72de25d10b1fc6f93eb69a791f36f8cd9a5adaaa5ff2aa" gracePeriod=30 Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.359701 4763 generic.go:334] "Generic (PLEG): container finished" podID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerID="0890ad246b756974b6b74972304dcf73498cc71646d0dabba0480b45f1736aab" exitCode=143 Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.359751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerDied","Data":"0890ad246b756974b6b74972304dcf73498cc71646d0dabba0480b45f1736aab"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.367779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerStarted","Data":"003ec72ca85190eed530bd3343625c7301e38e1d324d8359cab935c10a00ca6b"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.375416 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerStarted","Data":"66500ad66fcb09340039f99fbf80f972e215374b5c41fd307720621642c1de75"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.375460 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerStarted","Data":"1bfb2c7e3dc4958e5da11c7e4360d7211a220cfa1d99be7d9f120b0fa67fa775"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.381235 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api-log" containerID="cri-o://6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" gracePeriod=30 Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.382010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerStarted","Data":"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 
15:12:36.382063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerStarted","Data":"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef"} Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.382378 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api" containerID="cri-o://ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" gracePeriod=30 Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.382420 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.382456 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.410399 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5ffc68c745-rgzs7" podStartSLOduration=2.410374409 podStartE2EDuration="2.410374409s" podCreationTimestamp="2025-10-06 15:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:36.404375454 +0000 UTC m=+1153.559667966" watchObservedRunningTime="2025-10-06 15:12:36.410374409 +0000 UTC m=+1153.565666921" Oct 06 15:12:36 crc kubenswrapper[4763]: I1006 15:12:36.761195 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.309299 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.394247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerStarted","Data":"564a9b0f4e9fd4afbc67d841889eef92c3f966da86abe49d6f127aceda3fe5cd"} Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.395848 4763 generic.go:334] "Generic (PLEG): container finished" podID="93f5ba72-370b-4acb-b571-5714da3c5493" containerID="ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" exitCode=0 Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.395870 4763 generic.go:334] "Generic (PLEG): container finished" podID="93f5ba72-370b-4acb-b571-5714da3c5493" containerID="6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" exitCode=143 Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.396655 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.397080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerDied","Data":"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640"} Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.397108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerDied","Data":"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02"} Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.397121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93f5ba72-370b-4acb-b571-5714da3c5493","Type":"ContainerDied","Data":"a040b9e3251d31ffdde6200b3175e6f214e07b93f83d9ed37a1172d9e37ec1dd"} Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.397138 4763 scope.go:117] "RemoveContainer" containerID="ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.432922 4763 scope.go:117] "RemoveContainer" containerID="6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436220 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436251 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4dcs\" (UniqueName: \"kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.436440 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom\") pod \"93f5ba72-370b-4acb-b571-5714da3c5493\" (UID: \"93f5ba72-370b-4acb-b571-5714da3c5493\") " Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.437152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.437651 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs" (OuterVolumeSpecName: "logs") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.448804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.448904 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs" (OuterVolumeSpecName: "kube-api-access-k4dcs") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "kube-api-access-k4dcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.465803 4763 scope.go:117] "RemoveContainer" containerID="ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.466181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts" (OuterVolumeSpecName: "scripts") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: E1006 15:12:37.466343 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640\": container with ID starting with ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640 not found: ID does not exist" containerID="ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.466378 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640"} err="failed to get container status \"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640\": rpc error: code = NotFound desc = could not find container \"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640\": container with ID starting with ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640 not found: ID does not exist" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.466419 4763 scope.go:117] "RemoveContainer" containerID="6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" Oct 06 15:12:37 crc kubenswrapper[4763]: E1006 15:12:37.467335 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02\": container with ID starting with 6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02 not found: ID does not exist" containerID="6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.467373 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02"} err="failed to get container status \"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02\": rpc error: code = NotFound desc = could not find container \"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02\": container with ID starting with 6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02 not found: ID does not exist" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.467400 4763 scope.go:117] "RemoveContainer" containerID="ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.467631 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640"} err="failed to get container status \"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640\": rpc error: code = NotFound desc = could not find container \"ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640\": container with ID starting with ff03bdb94b8bd96c94e9b0961326e3ac05c3bc6c9f860c1df02a8a67ce3a4640 not found: ID does not exist" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.467650 4763 scope.go:117] "RemoveContainer" containerID="6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.467862 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02"} err="failed to get 
container status \"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02\": rpc error: code = NotFound desc = could not find container \"6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02\": container with ID starting with 6af7faefc7cd7a6e49c53b3afa582fc60e4e1bd0311a731a22fc0160a33b0a02 not found: ID does not exist" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.491591 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.508895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data" (OuterVolumeSpecName: "config-data") pod "93f5ba72-370b-4acb-b571-5714da3c5493" (UID: "93f5ba72-370b-4acb-b571-5714da3c5493"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538798 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538836 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4dcs\" (UniqueName: \"kubernetes.io/projected/93f5ba72-370b-4acb-b571-5714da3c5493-kube-api-access-k4dcs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538848 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538858 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538868 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f5ba72-370b-4acb-b571-5714da3c5493-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538880 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f5ba72-370b-4acb-b571-5714da3c5493-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.538890 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f5ba72-370b-4acb-b571-5714da3c5493-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.718220 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.725300 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.733238 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:12:37 crc kubenswrapper[4763]: 
E1006 15:12:37.733639 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api-log"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.733656 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api-log"
Oct 06 15:12:37 crc kubenswrapper[4763]: E1006 15:12:37.733672 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.733678 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.733881 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.733920 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" containerName="cinder-api-log"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.734862 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.737679 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.738082 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.738382 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.760388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlkj2\" (UniqueName: \"kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.842594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlkj2\" (UniqueName: \"kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945689 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.945786 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.946097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.946580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.951676 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.952218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.956378 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.957588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.958256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.959349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:37 crc kubenswrapper[4763]: I1006 15:12:37.962154 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlkj2\" (UniqueName: \"kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2\") pod \"cinder-api-0\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " pod="openstack/cinder-api-0"
Oct 06 15:12:38 crc kubenswrapper[4763]: I1006 15:12:38.074830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 15:12:38 crc kubenswrapper[4763]: I1006 15:12:38.516437 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 15:12:38 crc kubenswrapper[4763]: W1006 15:12:38.522144 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b750fb_21cc_4a04_ba58_bddcbc2161e7.slice/crio-7cc312f3062db4c2e6a51b852482e092bc8da0844dc86203230af8ace5e49e52 WatchSource:0}: Error finding container 7cc312f3062db4c2e6a51b852482e092bc8da0844dc86203230af8ace5e49e52: Status 404 returned error can't find the container with id 7cc312f3062db4c2e6a51b852482e092bc8da0844dc86203230af8ace5e49e52
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.055240 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.433931 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerStarted","Data":"aa6e7f6eb29514758c76df3df76faa9c7acb5e02d460158a0c98a8c23f62ff28"}
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.436127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerStarted","Data":"33de3bfd1cb8b6e84aac16172cd6d4816d4bbcc99bd594f7601b9f6bae6bc708"}
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.436167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerStarted","Data":"7cc312f3062db4c2e6a51b852482e092bc8da0844dc86203230af8ace5e49e52"}
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.448080 4763 generic.go:334] "Generic (PLEG): container finished" podID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerID="aa5d91c1e7400c99fe72de25d10b1fc6f93eb69a791f36f8cd9a5adaaa5ff2aa" exitCode=0
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.448156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerDied","Data":"aa5d91c1e7400c99fe72de25d10b1fc6f93eb69a791f36f8cd9a5adaaa5ff2aa"}
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.548760 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.611307 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f5ba72-370b-4acb-b571-5714da3c5493" path="/var/lib/kubelet/pods/93f5ba72-370b-4acb-b571-5714da3c5493/volumes"
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698313 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698368 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698542 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hx9w\" (UniqueName: \"kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.698748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs\") pod \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\" (UID: \"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f\") "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.699644 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs" (OuterVolumeSpecName: "logs") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.700346 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.726735 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts" (OuterVolumeSpecName: "scripts") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.726767 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.729092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w" (OuterVolumeSpecName: "kube-api-access-7hx9w") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "kube-api-access-7hx9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.774708 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.787891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data" (OuterVolumeSpecName: "config-data") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.792792 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" (UID: "f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800635 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800678 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800691 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hx9w\" (UniqueName: \"kubernetes.io/projected/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-kube-api-access-7hx9w\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800706 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800718 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800739 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800750 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.800792 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.830061 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.902638 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.917311 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.918324 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-log" containerID="cri-o://cf6dde839e85f0a737f41c91e1e0200462667c5c36ba626cfbd076b18ca61db2" gracePeriod=30
Oct 06 15:12:39 crc kubenswrapper[4763]: I1006 15:12:39.918809 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-httpd" containerID="cri-o://aca9843bff84c3cd1eed81cc10cd52d2a1fe3c06e4bcd6a0b65fd16efdfe4813" gracePeriod=30
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.440415 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b97fbd7bd-9qwfz"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.513320 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerStarted","Data":"6e23f0e5ee7363ef074a84ef0fdf33f87ae71f9dfadd96d75eeac46f6868a70c"}
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.514408 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.516604 4763 generic.go:334] "Generic (PLEG): container finished" podID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerID="cf6dde839e85f0a737f41c91e1e0200462667c5c36ba626cfbd076b18ca61db2" exitCode=143
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.516705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerDied","Data":"cf6dde839e85f0a737f41c91e1e0200462667c5c36ba626cfbd076b18ca61db2"}
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.525326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f","Type":"ContainerDied","Data":"f1877118d0bd6a54fa7da026217e396e77f26e332b2b8e18fde8e512bb4b89f2"}
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.525420 4763 scope.go:117] "RemoveContainer" containerID="aa5d91c1e7400c99fe72de25d10b1fc6f93eb69a791f36f8cd9a5adaaa5ff2aa"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.525580 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.560197 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.560177072 podStartE2EDuration="3.560177072s" podCreationTimestamp="2025-10-06 15:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:40.540918592 +0000 UTC m=+1157.696211104" watchObservedRunningTime="2025-10-06 15:12:40.560177072 +0000 UTC m=+1157.715469584"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.577945 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.594038 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.630014 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 15:12:40 crc kubenswrapper[4763]: E1006 15:12:40.630389 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-log"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.630405 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-log"
Oct 06 15:12:40 crc kubenswrapper[4763]: E1006 15:12:40.630433 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-httpd"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.630439 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-httpd"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.630597 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-httpd"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.630626 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" containerName="glance-log"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.633961 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.639280 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.639730 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.640000 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.821568 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.821628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.821661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.821683 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.821965 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.822012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6t66\" (UniqueName: \"kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.822052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.822238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924159 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6t66\" (UniqueName: \"kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.924916 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.926231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.926323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.926935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.930071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.930852 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.931323 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.932594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.941861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.944432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6t66\" (UniqueName: \"kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:40 crc kubenswrapper[4763]: I1006 15:12:40.972949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") " pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:41 crc kubenswrapper[4763]: I1006 15:12:41.257491 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:12:41 crc kubenswrapper[4763]: I1006 15:12:41.591966 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f" path="/var/lib/kubelet/pods/f33bede1-4e9c-4226-b8e0-9ad2bb7ad80f/volumes"
Oct 06 15:12:41 crc kubenswrapper[4763]: I1006 15:12:41.894802 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-nft57"
Oct 06 15:12:41 crc kubenswrapper[4763]: I1006 15:12:41.974027 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"]
Oct 06 15:12:41 crc kubenswrapper[4763]: I1006 15:12:41.974548 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="dnsmasq-dns" containerID="cri-o://6e9553bed3c4aaedc82dfaaa2230a429a9f286abaa71c5cbe7ac273500f6f912" gracePeriod=10
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.059453 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.132679 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.548038 4763 generic.go:334] "Generic (PLEG): container finished" podID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerID="6e9553bed3c4aaedc82dfaaa2230a429a9f286abaa71c5cbe7ac273500f6f912" exitCode=0
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.548135 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" event={"ID":"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb","Type":"ContainerDied","Data":"6e9553bed3c4aaedc82dfaaa2230a429a9f286abaa71c5cbe7ac273500f6f912"}
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.548467 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="cinder-scheduler" containerID="cri-o://b31661be98b8240c26143e8f17e906efe34924a310dad31c6977f96d6065437a" gracePeriod=30
Oct 06 15:12:42 crc kubenswrapper[4763]: I1006 15:12:42.548503 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="probe" containerID="cri-o://003ec72ca85190eed530bd3343625c7301e38e1d324d8359cab935c10a00ca6b" gracePeriod=30
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.034592 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b667cdf65-js4pw"
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.087337 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"]
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.088458 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b97fbd7bd-9qwfz" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-api" containerID="cri-o://178342601b11f43b80ee57ac3844284165aca64d56c0cdd232573ba7de42e82b" gracePeriod=30
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.088939 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b97fbd7bd-9qwfz" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-httpd" containerID="cri-o://0f3abaa43abd980ead73f222d61067f438dce0f473043b9f6581afd21b553dc8" gracePeriod=30
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.559606 4763 generic.go:334] "Generic (PLEG): container finished" podID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerID="aca9843bff84c3cd1eed81cc10cd52d2a1fe3c06e4bcd6a0b65fd16efdfe4813" exitCode=0
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.559784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerDied","Data":"aca9843bff84c3cd1eed81cc10cd52d2a1fe3c06e4bcd6a0b65fd16efdfe4813"}
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.562939 4763 generic.go:334] "Generic (PLEG): container finished" podID="611dea12-329a-4168-9647-a5fa92453712" containerID="0f3abaa43abd980ead73f222d61067f438dce0f473043b9f6581afd21b553dc8" exitCode=0
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.563032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerDied","Data":"0f3abaa43abd980ead73f222d61067f438dce0f473043b9f6581afd21b553dc8"}
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.567086 4763 generic.go:334] "Generic (PLEG): container finished" podID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerID="003ec72ca85190eed530bd3343625c7301e38e1d324d8359cab935c10a00ca6b" exitCode=0
Oct 06 15:12:43 crc kubenswrapper[4763]: I1006 15:12:43.567132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerDied","Data":"003ec72ca85190eed530bd3343625c7301e38e1d324d8359cab935c10a00ca6b"}
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.343752 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kz6ph"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.345131 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kz6ph"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.356851 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kz6ph"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.391184 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp577\" (UniqueName: \"kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577\") pod \"nova-api-db-create-kz6ph\" (UID: \"ed100452-a1dc-4014-8f80-56a9ac00b198\") " pod="openstack/nova-api-db-create-kz6ph"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.440245 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-km4xb"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.441419 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-km4xb"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.468431 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-km4xb"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.492446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp577\" (UniqueName: \"kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577\") pod \"nova-api-db-create-kz6ph\" (UID: \"ed100452-a1dc-4014-8f80-56a9ac00b198\") " pod="openstack/nova-api-db-create-kz6ph"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.492561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhtz\" (UniqueName: \"kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz\") pod \"nova-cell0-db-create-km4xb\" (UID: \"ff78748e-fd0a-43e5-a9c5-35daa377da26\") " pod="openstack/nova-cell0-db-create-km4xb"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.516854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp577\" (UniqueName: \"kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577\") pod \"nova-api-db-create-kz6ph\" (UID: \"ed100452-a1dc-4014-8f80-56a9ac00b198\") " pod="openstack/nova-api-db-create-kz6ph"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.561746 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ffc68c745-rgzs7"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.562992 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ffc68c745-rgzs7"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.596516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhtz\" (UniqueName: \"kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz\") pod \"nova-cell0-db-create-km4xb\" (UID: \"ff78748e-fd0a-43e5-a9c5-35daa377da26\") " pod="openstack/nova-cell0-db-create-km4xb"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.634857 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhtz\" (UniqueName: \"kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz\") pod \"nova-cell0-db-create-km4xb\" (UID: \"ff78748e-fd0a-43e5-a9c5-35daa377da26\") " pod="openstack/nova-cell0-db-create-km4xb"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.666731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kz6ph"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.671257 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hf5qv"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.672798 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hf5qv"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.682959 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hf5qv"]
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.768553 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-km4xb"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.802094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bnv6\" (UniqueName: \"kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6\") pod \"nova-cell1-db-create-hf5qv\" (UID: \"4f76244e-f9eb-428e-bd75-92eb5c6204a8\") " pod="openstack/nova-cell1-db-create-hf5qv"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.903735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnv6\" (UniqueName: \"kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6\") pod \"nova-cell1-db-create-hf5qv\" (UID: \"4f76244e-f9eb-428e-bd75-92eb5c6204a8\") " pod="openstack/nova-cell1-db-create-hf5qv"
Oct 06 15:12:44 crc kubenswrapper[4763]: I1006 15:12:44.920322 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bnv6\" (UniqueName: \"kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6\") pod \"nova-cell1-db-create-hf5qv\" (UID: \"4f76244e-f9eb-428e-bd75-92eb5c6204a8\") " pod="openstack/nova-cell1-db-create-hf5qv"
Oct 06 15:12:45 crc kubenswrapper[4763]: I1006 15:12:45.001663 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hf5qv"
Oct 06 15:12:45 crc kubenswrapper[4763]: I1006 15:12:45.425064 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused"
Oct 06 15:12:45 crc kubenswrapper[4763]: I1006 15:12:45.590174 4763 generic.go:334] "Generic (PLEG): container finished" podID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerID="b31661be98b8240c26143e8f17e906efe34924a310dad31c6977f96d6065437a" exitCode=0
Oct 06 15:12:45 crc kubenswrapper[4763]: I1006 15:12:45.590263 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerDied","Data":"b31661be98b8240c26143e8f17e906efe34924a310dad31c6977f96d6065437a"}
Oct 06 15:12:46 crc kubenswrapper[4763]: I1006 15:12:46.603827 4763 generic.go:334] "Generic (PLEG): container finished" podID="611dea12-329a-4168-9647-a5fa92453712" containerID="178342601b11f43b80ee57ac3844284165aca64d56c0cdd232573ba7de42e82b" exitCode=0
Oct 06 15:12:46 crc kubenswrapper[4763]: I1006 15:12:46.604024 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerDied","Data":"178342601b11f43b80ee57ac3844284165aca64d56c0cdd232573ba7de42e82b"}
Oct 06 15:12:46 crc kubenswrapper[4763]: I1006 15:12:46.677859 4763 scope.go:117] "RemoveContainer" containerID="0890ad246b756974b6b74972304dcf73498cc71646d0dabba0480b45f1736aab"
Oct 06 15:12:46 crc kubenswrapper[4763]: I1006 15:12:46.976171 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz"
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045190 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045426 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045498 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.045553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfqn8\" (UniqueName: \"kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8\") pod \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\" (UID: \"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.076467 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8" (OuterVolumeSpecName: "kube-api-access-xfqn8") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "kube-api-access-xfqn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.176322 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfqn8\" (UniqueName: \"kubernetes.io/projected/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-kube-api-access-xfqn8\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.181355 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278269 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7l9d\" (UniqueName: \"kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278375 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278454 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278772 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.278835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts\") pod \"7ae39c8a-af64-489c-bb82-27c32c21279e\" (UID: \"7ae39c8a-af64-489c-bb82-27c32c21279e\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.280520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.324396 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d" (OuterVolumeSpecName: "kube-api-access-v7l9d") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "kube-api-access-v7l9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.326113 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.333270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts" (OuterVolumeSpecName: "scripts") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.382147 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.382205 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.382216 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7l9d\" (UniqueName: \"kubernetes.io/projected/7ae39c8a-af64-489c-bb82-27c32c21279e-kube-api-access-v7l9d\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.382226 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae39c8a-af64-489c-bb82-27c32c21279e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.397594 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.411935 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b97fbd7bd-9qwfz"
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.462746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.463315 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config" (OuterVolumeSpecName: "config") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496302 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6rhw\" (UniqueName: \"kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496427 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496459 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle\") pod \"611dea12-329a-4168-9647-a5fa92453712\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496510 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496555 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496606 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config\") pod \"611dea12-329a-4168-9647-a5fa92453712\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496638 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config\") pod \"611dea12-329a-4168-9647-a5fa92453712\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496684 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs\") pod \"611dea12-329a-4168-9647-a5fa92453712\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496746 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle\") pod \"ad043fd9-8b69-43b9-a155-1e395d0f4685\" (UID: \"ad043fd9-8b69-43b9-a155-1e395d0f4685\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.496764 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl7qc\" (UniqueName: \"kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc\") pod \"611dea12-329a-4168-9647-a5fa92453712\" (UID: \"611dea12-329a-4168-9647-a5fa92453712\") "
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.497134 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.497146 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.502000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs" (OuterVolumeSpecName: "logs") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.512260 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.529966 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kz6ph"]
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.531546 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.531567 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.533572 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts" (OuterVolumeSpecName: "scripts") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.535034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw" (OuterVolumeSpecName: "kube-api-access-s6rhw") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "kube-api-access-s6rhw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.543126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc" (OuterVolumeSpecName: "kube-api-access-kl7qc") pod "611dea12-329a-4168-9647-a5fa92453712" (UID: "611dea12-329a-4168-9647-a5fa92453712"). InnerVolumeSpecName "kube-api-access-kl7qc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.543663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.547711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "611dea12-329a-4168-9647-a5fa92453712" (UID: "611dea12-329a-4168-9647-a5fa92453712"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.598357 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" (UID: "8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599498 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599521 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599530 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599539 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599547 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599555 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599563 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl7qc\" (UniqueName: \"kubernetes.io/projected/611dea12-329a-4168-9647-a5fa92453712-kube-api-access-kl7qc\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599574 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599582 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6rhw\" (UniqueName: \"kubernetes.io/projected/ad043fd9-8b69-43b9-a155-1e395d0f4685-kube-api-access-s6rhw\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.599589 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad043fd9-8b69-43b9-a155-1e395d0f4685-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.654729 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.674265 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.680881 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.691962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.691975 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-central-agent" containerID="cri-o://66500ad66fcb09340039f99fbf80f972e215374b5c41fd307720621642c1de75" gracePeriod=30 Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.692402 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="proxy-httpd" containerID="cri-o://0cc0022d81473732b7a117b62fec511474864193b56cb8d000ea9be2a87c82b5" gracePeriod=30 Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.692496 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-notification-agent" containerID="cri-o://564a9b0f4e9fd4afbc67d841889eef92c3f966da86abe49d6f127aceda3fe5cd" gracePeriod=30 Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.692493 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="sg-core" containerID="cri-o://aa6e7f6eb29514758c76df3df76faa9c7acb5e02d460158a0c98a8c23f62ff28" gracePeriod=30 Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.697068 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.704237 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.719661 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.719702 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.719717 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.760763 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b97fbd7bd-9qwfz" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.766104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.774464 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.379865199 podStartE2EDuration="13.774439638s" podCreationTimestamp="2025-10-06 15:12:34 +0000 UTC" firstStartedPulling="2025-10-06 15:12:35.55478024 +0000 UTC m=+1152.710072742" lastFinishedPulling="2025-10-06 15:12:46.949354669 +0000 UTC m=+1164.104647181" observedRunningTime="2025-10-06 15:12:47.748303027 +0000 UTC m=+1164.903595549" watchObservedRunningTime="2025-10-06 15:12:47.774439638 +0000 UTC m=+1164.929732150" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.790016 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "611dea12-329a-4168-9647-a5fa92453712" (UID: "611dea12-329a-4168-9647-a5fa92453712"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.794196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data" (OuterVolumeSpecName: "config-data") pod "ad043fd9-8b69-43b9-a155-1e395d0f4685" (UID: "ad043fd9-8b69-43b9-a155-1e395d0f4685"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.800761 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.937686209 podStartE2EDuration="18.800730812s" podCreationTimestamp="2025-10-06 15:12:29 +0000 UTC" firstStartedPulling="2025-10-06 15:12:29.98156291 +0000 UTC m=+1147.136855422" lastFinishedPulling="2025-10-06 15:12:46.844607503 +0000 UTC m=+1163.999900025" observedRunningTime="2025-10-06 15:12:47.790162531 +0000 UTC m=+1164.945455043" watchObservedRunningTime="2025-10-06 15:12:47.800730812 +0000 UTC m=+1164.956023324" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.805758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data" (OuterVolumeSpecName: "config-data") pod "7ae39c8a-af64-489c-bb82-27c32c21279e" (UID: "7ae39c8a-af64-489c-bb82-27c32c21279e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.821878 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae39c8a-af64-489c-bb82-27c32c21279e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.821913 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.821924 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.821932 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad043fd9-8b69-43b9-a155-1e395d0f4685-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.864096 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config" (OuterVolumeSpecName: "config") pod "611dea12-329a-4168-9647-a5fa92453712" (UID: "611dea12-329a-4168-9647-a5fa92453712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.920536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "611dea12-329a-4168-9647-a5fa92453712" (UID: "611dea12-329a-4168-9647-a5fa92453712"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.924322 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.924347 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/611dea12-329a-4168-9647-a5fa92453712-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.949634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-4d6jz" event={"ID":"8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb","Type":"ContainerDied","Data":"c3c6cc4df4b7ee80612fc88fcc663ae8b705cbc3733755ceda1edc46d708b0f9"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.949926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ae39c8a-af64-489c-bb82-27c32c21279e","Type":"ContainerDied","Data":"8596860edd134df877602b1da440e2bc843a3717c9ff30d9c2f0ff768a3d38a0"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.949945 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hf5qv"] Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.949969 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.949987 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerStarted","Data":"0cc0022d81473732b7a117b62fec511474864193b56cb8d000ea9be2a87c82b5"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad043fd9-8b69-43b9-a155-1e395d0f4685","Type":"ContainerDied","Data":"05c623959fc0dfd300ec86ac454057decd6e955fd3375fb6906f70967f589922"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b97fbd7bd-9qwfz" event={"ID":"611dea12-329a-4168-9647-a5fa92453712","Type":"ContainerDied","Data":"63d31e5c124b620ce299e19c9a00872b9c462ea03ac19bfd1fead87fae1ea538"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950037 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-km4xb"] Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950050 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f62b41ca-5e6d-4760-be62-af924b841737","Type":"ContainerStarted","Data":"ba6cd5ea4c4214f0c3ffef878583fe94e05f8dc0d7cd4b6711ee7dd59935b9a6"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kz6ph" event={"ID":"ed100452-a1dc-4014-8f80-56a9ac00b198","Type":"ContainerStarted","Data":"d7729990aebe4928382575405818701880251a2e72119db5d37f3650d2cc7771"} Oct 06 15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.950482 4763 scope.go:117] "RemoveContainer" containerID="6e9553bed3c4aaedc82dfaaa2230a429a9f286abaa71c5cbe7ac273500f6f912" Oct 06 
15:12:47 crc kubenswrapper[4763]: I1006 15:12:47.997146 4763 scope.go:117] "RemoveContainer" containerID="d745b1a7526c0318fc2e000726c3ede4adb56bf47e3b8fa345950713ab04e7ea" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.037644 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.055959 4763 scope.go:117] "RemoveContainer" containerID="003ec72ca85190eed530bd3343625c7301e38e1d324d8359cab935c10a00ca6b" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.058090 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-4d6jz"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.076927 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.083964 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.094034 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.094577 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-log" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.095783 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-log" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.095911 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096133 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096215 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="init" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096355 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="init" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096410 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-api" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096457 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-api" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096506 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="cinder-scheduler" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096554 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="cinder-scheduler" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096626 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="dnsmasq-dns" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096691 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="dnsmasq-dns" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096761 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="probe" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096811 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="probe" Oct 06 15:12:48 crc kubenswrapper[4763]: E1006 15:12:48.096879 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.096927 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097145 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097410 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="probe" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-api" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097540 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="611dea12-329a-4168-9647-a5fa92453712" containerName="neutron-httpd" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097592 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" containerName="glance-log" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097665 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" containerName="cinder-scheduler" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.097719 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" containerName="dnsmasq-dns" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.098707 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.100922 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.110265 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.139643 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.159008 4763 scope.go:117] "RemoveContainer" containerID="b31661be98b8240c26143e8f17e906efe34924a310dad31c6977f96d6065437a" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.166308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.182794 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.196751 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b97fbd7bd-9qwfz"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.207670 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.214545 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.219092 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.219502 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235244 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235395 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjn69\" (UniqueName: \"kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235652 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.235728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.240710 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.246240 4763 scope.go:117] "RemoveContainer" containerID="aca9843bff84c3cd1eed81cc10cd52d2a1fe3c06e4bcd6a0b65fd16efdfe4813" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.334962 4763 scope.go:117] "RemoveContainer" containerID="cf6dde839e85f0a737f41c91e1e0200462667c5c36ba626cfbd076b18ca61db2" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjn69\" (UniqueName: \"kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337538 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rggx2\" (UniqueName: \"kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337798 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " 
pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.337950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338113 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.338472 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 
15:12:48.344396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.344426 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.345641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.346108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.360594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjn69\" (UniqueName: \"kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69\") pod \"cinder-scheduler-0\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.439740 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.439829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.439857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440268 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440451 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rggx2\" (UniqueName: \"kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.440941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.441063 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.441464 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.444816 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.446192 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.448225 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.454263 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.460876 4763 scope.go:117] "RemoveContainer" containerID="0f3abaa43abd980ead73f222d61067f438dce0f473043b9f6581afd21b553dc8" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.461780 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rggx2\" (UniqueName: \"kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.480898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.502816 4763 scope.go:117] "RemoveContainer" containerID="178342601b11f43b80ee57ac3844284165aca64d56c0cdd232573ba7de42e82b" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.539129 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.847271 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff78748e-fd0a-43e5-a9c5-35daa377da26" containerID="7e2562acc0fae8c8886b551ae63328fe2b596b0aa0ed9eebdb4aafd3a62ad900" exitCode=0 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.847804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-km4xb" event={"ID":"ff78748e-fd0a-43e5-a9c5-35daa377da26","Type":"ContainerDied","Data":"7e2562acc0fae8c8886b551ae63328fe2b596b0aa0ed9eebdb4aafd3a62ad900"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.847846 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-km4xb" event={"ID":"ff78748e-fd0a-43e5-a9c5-35daa377da26","Type":"ContainerStarted","Data":"7d77439bcc9beac62c04be76a6f1e77313fb9a6cd7ba609539cd357ff2c02ab5"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.851050 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed100452-a1dc-4014-8f80-56a9ac00b198" containerID="60c57e4673b5def98438adfcc0581b8c0827205fea234915bea281492ad7e221" exitCode=0 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.851098 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kz6ph" event={"ID":"ed100452-a1dc-4014-8f80-56a9ac00b198","Type":"ContainerDied","Data":"60c57e4673b5def98438adfcc0581b8c0827205fea234915bea281492ad7e221"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.853850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerStarted","Data":"aa4a777788599da63887c8bae5fa04a3d16d36136221168f67e902cc9b4ffdbc"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.853897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerStarted","Data":"96389ebe98abf5578c37fd0a35183b2f424a733cb406605f7b723eaa55cb3668"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.864391 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerID="0cc0022d81473732b7a117b62fec511474864193b56cb8d000ea9be2a87c82b5" exitCode=0 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.864422 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerID="aa6e7f6eb29514758c76df3df76faa9c7acb5e02d460158a0c98a8c23f62ff28" exitCode=2 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.864432 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerID="66500ad66fcb09340039f99fbf80f972e215374b5c41fd307720621642c1de75" exitCode=0 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.864459 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerDied","Data":"0cc0022d81473732b7a117b62fec511474864193b56cb8d000ea9be2a87c82b5"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.864511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerDied","Data":"aa6e7f6eb29514758c76df3df76faa9c7acb5e02d460158a0c98a8c23f62ff28"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 
15:12:48.864520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerDied","Data":"66500ad66fcb09340039f99fbf80f972e215374b5c41fd307720621642c1de75"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.870943 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f76244e-f9eb-428e-bd75-92eb5c6204a8" containerID="4793430acd2d3747e4f90ca4aa9269c1b1570b34bc980bdff5cf2808f5651841" exitCode=0 Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.871024 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hf5qv" event={"ID":"4f76244e-f9eb-428e-bd75-92eb5c6204a8","Type":"ContainerDied","Data":"4793430acd2d3747e4f90ca4aa9269c1b1570b34bc980bdff5cf2808f5651841"} Oct 06 15:12:48 crc kubenswrapper[4763]: I1006 15:12:48.871052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hf5qv" event={"ID":"4f76244e-f9eb-428e-bd75-92eb5c6204a8","Type":"ContainerStarted","Data":"ec588fbc6e69490374eb103ac771371c64324f736a3a7534cde6e170646a94af"} Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.007150 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:12:49 crc kubenswrapper[4763]: W1006 15:12:49.013673 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed9fd34_3ac8_4420_958a_d4d41f7c83fa.slice/crio-043ddb8ebba5c4c9199d8e7a668f4e3a49d03de335508579b67071918c00caa6 WatchSource:0}: Error finding container 043ddb8ebba5c4c9199d8e7a668f4e3a49d03de335508579b67071918c00caa6: Status 404 returned error can't find the container with id 043ddb8ebba5c4c9199d8e7a668f4e3a49d03de335508579b67071918c00caa6 Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.277883 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.602000 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611dea12-329a-4168-9647-a5fa92453712" path="/var/lib/kubelet/pods/611dea12-329a-4168-9647-a5fa92453712/volumes" Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.603383 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae39c8a-af64-489c-bb82-27c32c21279e" path="/var/lib/kubelet/pods/7ae39c8a-af64-489c-bb82-27c32c21279e/volumes" Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.604202 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb" path="/var/lib/kubelet/pods/8fdbe9ac-f0c5-4e41-9d18-d0951b201cbb/volumes" Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.605580 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad043fd9-8b69-43b9-a155-1e395d0f4685" path="/var/lib/kubelet/pods/ad043fd9-8b69-43b9-a155-1e395d0f4685/volumes" Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.888827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerStarted","Data":"16fb5d9200273d3f09a4302bb3ab7baef0a10a31cae4facb34c852e6d5e82428"} Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.904142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerStarted","Data":"08a1f33d966b3e95b4e43e2c4c9605f94fb70073e52950d5c8d680e81f2eeaa1"} Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.918329 4763 generic.go:334] "Generic (PLEG): container finished" podID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerID="564a9b0f4e9fd4afbc67d841889eef92c3f966da86abe49d6f127aceda3fe5cd" exitCode=0 Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.918727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerDied","Data":"564a9b0f4e9fd4afbc67d841889eef92c3f966da86abe49d6f127aceda3fe5cd"} Oct 06 15:12:49 crc kubenswrapper[4763]: I1006 15:12:49.944997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerStarted","Data":"043ddb8ebba5c4c9199d8e7a668f4e3a49d03de335508579b67071918c00caa6"} Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.193693 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.237291 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.236626252 podStartE2EDuration="10.236626252s" podCreationTimestamp="2025-10-06 15:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:49.931520634 +0000 UTC m=+1167.086813146" watchObservedRunningTime="2025-10-06 15:12:50.236626252 +0000 UTC m=+1167.391918764" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.275761 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.275945 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.276003 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.276089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.276266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grh9\" (UniqueName: \"kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc 
kubenswrapper[4763]: I1006 15:12:50.276432 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.276511 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts\") pod \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\" (UID: \"4e3aa366-1377-4a4f-b943-c0cb15cc3391\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.283549 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.283937 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.288758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts" (OuterVolumeSpecName: "scripts") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.290441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9" (OuterVolumeSpecName: "kube-api-access-7grh9") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "kube-api-access-7grh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.306660 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.372745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380831 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380882 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380896 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380910 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380921 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e3aa366-1377-4a4f-b943-c0cb15cc3391-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.380932 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7grh9\" (UniqueName: \"kubernetes.io/projected/4e3aa366-1377-4a4f-b943-c0cb15cc3391-kube-api-access-7grh9\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.414083 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hf5qv" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.458718 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data" (OuterVolumeSpecName: "config-data") pod "4e3aa366-1377-4a4f-b943-c0cb15cc3391" (UID: "4e3aa366-1377-4a4f-b943-c0cb15cc3391"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.482552 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bnv6\" (UniqueName: \"kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6\") pod \"4f76244e-f9eb-428e-bd75-92eb5c6204a8\" (UID: \"4f76244e-f9eb-428e-bd75-92eb5c6204a8\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.483007 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3aa366-1377-4a4f-b943-c0cb15cc3391-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.488454 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6" (OuterVolumeSpecName: "kube-api-access-2bnv6") pod "4f76244e-f9eb-428e-bd75-92eb5c6204a8" (UID: "4f76244e-f9eb-428e-bd75-92eb5c6204a8"). InnerVolumeSpecName "kube-api-access-2bnv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.551691 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.585254 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bnv6\" (UniqueName: \"kubernetes.io/projected/4f76244e-f9eb-428e-bd75-92eb5c6204a8-kube-api-access-2bnv6\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.637262 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-km4xb" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.670176 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kz6ph" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.689793 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkhtz\" (UniqueName: \"kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz\") pod \"ff78748e-fd0a-43e5-a9c5-35daa377da26\" (UID: \"ff78748e-fd0a-43e5-a9c5-35daa377da26\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.694436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz" (OuterVolumeSpecName: "kube-api-access-lkhtz") pod "ff78748e-fd0a-43e5-a9c5-35daa377da26" (UID: "ff78748e-fd0a-43e5-a9c5-35daa377da26"). InnerVolumeSpecName "kube-api-access-lkhtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.793964 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp577\" (UniqueName: \"kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577\") pod \"ed100452-a1dc-4014-8f80-56a9ac00b198\" (UID: \"ed100452-a1dc-4014-8f80-56a9ac00b198\") " Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.794701 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkhtz\" (UniqueName: \"kubernetes.io/projected/ff78748e-fd0a-43e5-a9c5-35daa377da26-kube-api-access-lkhtz\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.812067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577" (OuterVolumeSpecName: "kube-api-access-lp577") pod "ed100452-a1dc-4014-8f80-56a9ac00b198" (UID: "ed100452-a1dc-4014-8f80-56a9ac00b198"). InnerVolumeSpecName "kube-api-access-lp577". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.896942 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp577\" (UniqueName: \"kubernetes.io/projected/ed100452-a1dc-4014-8f80-56a9ac00b198-kube-api-access-lp577\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.991541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kz6ph" event={"ID":"ed100452-a1dc-4014-8f80-56a9ac00b198","Type":"ContainerDied","Data":"d7729990aebe4928382575405818701880251a2e72119db5d37f3650d2cc7771"} Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.991577 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7729990aebe4928382575405818701880251a2e72119db5d37f3650d2cc7771" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.991657 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kz6ph" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.998247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-km4xb" event={"ID":"ff78748e-fd0a-43e5-a9c5-35daa377da26","Type":"ContainerDied","Data":"7d77439bcc9beac62c04be76a6f1e77313fb9a6cd7ba609539cd357ff2c02ab5"} Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.998287 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d77439bcc9beac62c04be76a6f1e77313fb9a6cd7ba609539cd357ff2c02ab5" Oct 06 15:12:50 crc kubenswrapper[4763]: I1006 15:12:50.998365 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-km4xb" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.004071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e3aa366-1377-4a4f-b943-c0cb15cc3391","Type":"ContainerDied","Data":"1bfb2c7e3dc4958e5da11c7e4360d7211a220cfa1d99be7d9f120b0fa67fa775"} Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.004116 4763 scope.go:117] "RemoveContainer" containerID="0cc0022d81473732b7a117b62fec511474864193b56cb8d000ea9be2a87c82b5" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.004243 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.011354 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hf5qv" event={"ID":"4f76244e-f9eb-428e-bd75-92eb5c6204a8","Type":"ContainerDied","Data":"ec588fbc6e69490374eb103ac771371c64324f736a3a7534cde6e170646a94af"} Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.011385 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec588fbc6e69490374eb103ac771371c64324f736a3a7534cde6e170646a94af" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.011433 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hf5qv" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.024815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerStarted","Data":"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428"} Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.024858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerStarted","Data":"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50"} Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.028789 4763 scope.go:117] "RemoveContainer" containerID="aa6e7f6eb29514758c76df3df76faa9c7acb5e02d460158a0c98a8c23f62ff28" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.031207 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerStarted","Data":"8f6a767bdad431305d84e26950c1846ebe85fc9d498b6fa5277ed34f9be13eb0"} Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.050091 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.0500779 podStartE2EDuration="3.0500779s" podCreationTimestamp="2025-10-06 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:51.048572109 +0000 UTC m=+1168.203864631" watchObservedRunningTime="2025-10-06 15:12:51.0500779 +0000 UTC m=+1168.205370412" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.072108 4763 scope.go:117] "RemoveContainer" containerID="564a9b0f4e9fd4afbc67d841889eef92c3f966da86abe49d6f127aceda3fe5cd" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.080955 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.089604 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098016 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098443 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff78748e-fd0a-43e5-a9c5-35daa377da26" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098461 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff78748e-fd0a-43e5-a9c5-35daa377da26" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098482 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed100452-a1dc-4014-8f80-56a9ac00b198" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098488 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed100452-a1dc-4014-8f80-56a9ac00b198" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098498 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-central-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098504 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" 
containerName="ceilometer-central-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098515 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="sg-core" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098520 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="sg-core" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098538 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="proxy-httpd" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098546 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="proxy-httpd" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098557 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-notification-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098564 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-notification-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: E1006 15:12:51.098576 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f76244e-f9eb-428e-bd75-92eb5c6204a8" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098582 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f76244e-f9eb-428e-bd75-92eb5c6204a8" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098840 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="sg-core" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098850 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-central-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098857 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="proxy-httpd" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098873 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff78748e-fd0a-43e5-a9c5-35daa377da26" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098884 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed100452-a1dc-4014-8f80-56a9ac00b198" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098899 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f76244e-f9eb-428e-bd75-92eb5c6204a8" containerName="mariadb-database-create" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.098909 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" containerName="ceilometer-notification-agent" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.100452 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.102931 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.103691 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.112195 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.131981 4763 scope.go:117] "RemoveContainer" containerID="66500ad66fcb09340039f99fbf80f972e215374b5c41fd307720621642c1de75" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202460 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202516 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202598 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202642 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202741 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.202780 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj2zp\" (UniqueName: \"kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.259127 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 
15:12:51.259212 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.289519 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304704 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj2zp\" (UniqueName: \"kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304823 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.304864 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.307825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.308359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.317181 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.318036 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.320464 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.323673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.332322 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.334133 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj2zp\" (UniqueName: \"kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp\") pod \"ceilometer-0\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.418498 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.620544 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e3aa366-1377-4a4f-b943-c0cb15cc3391" path="/var/lib/kubelet/pods/4e3aa366-1377-4a4f-b943-c0cb15cc3391/volumes" Oct 06 15:12:51 crc kubenswrapper[4763]: I1006 15:12:51.950396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:52 crc kubenswrapper[4763]: I1006 15:12:52.041520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerStarted","Data":"d1eb8ece3ed05071e8f0e183f59d758fa3e0c6899e555329438a62699f5c4813"} Oct 06 15:12:52 crc kubenswrapper[4763]: I1006 15:12:52.042677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerStarted","Data":"1d08fcf8ddd5b949c0c6439471321ef1e1158c40e9fef98fbd501a165e66465d"} Oct 06 15:12:52 crc kubenswrapper[4763]: I1006 15:12:52.042837 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:52 crc kubenswrapper[4763]: I1006 15:12:52.042876 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:52 crc kubenswrapper[4763]: I1006 15:12:52.063898 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.06387794 podStartE2EDuration="4.06387794s" podCreationTimestamp="2025-10-06 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:12:52.060324652 +0000 UTC m=+1169.215617194" watchObservedRunningTime="2025-10-06 15:12:52.06387794 +0000 UTC m=+1169.219170452" Oct 06 15:12:53 crc kubenswrapper[4763]: I1006 15:12:53.051512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerStarted","Data":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} Oct 06 15:12:53 crc kubenswrapper[4763]: I1006 15:12:53.442694 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.061967 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerStarted","Data":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.550076 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3e42-account-create-fwrfq"] Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.551598 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.553969 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.582844 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3e42-account-create-fwrfq"] Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.660770 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hqcr\" (UniqueName: \"kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr\") pod \"nova-api-3e42-account-create-fwrfq\" (UID: \"c1727578-835e-40a5-b9f2-84fbfe40a92f\") " pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.762686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hqcr\" (UniqueName: \"kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr\") pod \"nova-api-3e42-account-create-fwrfq\" (UID: \"c1727578-835e-40a5-b9f2-84fbfe40a92f\") " pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.781968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hqcr\" (UniqueName: \"kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr\") pod \"nova-api-3e42-account-create-fwrfq\" (UID: \"c1727578-835e-40a5-b9f2-84fbfe40a92f\") " pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:54 crc kubenswrapper[4763]: I1006 15:12:54.868484 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:55 crc kubenswrapper[4763]: I1006 15:12:55.012707 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:55 crc kubenswrapper[4763]: I1006 15:12:55.083106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerStarted","Data":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} Oct 06 15:12:55 crc kubenswrapper[4763]: I1006 15:12:55.379353 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3e42-account-create-fwrfq"] Oct 06 15:12:56 crc kubenswrapper[4763]: I1006 15:12:56.096362 4763 generic.go:334] "Generic (PLEG): container finished" podID="c1727578-835e-40a5-b9f2-84fbfe40a92f" containerID="407915862e4e0141f6174c767f9b9f07b89723636e5be3b23e2f2c38957637a1" exitCode=0 Oct 06 15:12:56 crc kubenswrapper[4763]: I1006 15:12:56.096575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3e42-account-create-fwrfq" event={"ID":"c1727578-835e-40a5-b9f2-84fbfe40a92f","Type":"ContainerDied","Data":"407915862e4e0141f6174c767f9b9f07b89723636e5be3b23e2f2c38957637a1"} Oct 06 15:12:56 crc kubenswrapper[4763]: I1006 15:12:56.096796 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3e42-account-create-fwrfq" event={"ID":"c1727578-835e-40a5-b9f2-84fbfe40a92f","Type":"ContainerStarted","Data":"6b133e2b798ee2b3b0d40b1efe615dbbf5505d7c33969d15359970c9a5beddee"} Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.107771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerStarted","Data":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.108059 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.137202 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.147739839 podStartE2EDuration="6.137179793s" podCreationTimestamp="2025-10-06 15:12:51 +0000 UTC" firstStartedPulling="2025-10-06 15:12:51.988454371 +0000 UTC m=+1169.143746883" lastFinishedPulling="2025-10-06 15:12:55.977894315 +0000 UTC m=+1173.133186837" observedRunningTime="2025-10-06 15:12:57.127497446 +0000 UTC m=+1174.282789948" watchObservedRunningTime="2025-10-06 15:12:57.137179793 +0000 UTC m=+1174.292472315" Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.509851 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.548190 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.613071 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hqcr\" (UniqueName: \"kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr\") pod \"c1727578-835e-40a5-b9f2-84fbfe40a92f\" (UID: \"c1727578-835e-40a5-b9f2-84fbfe40a92f\") " Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.618727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr" (OuterVolumeSpecName: "kube-api-access-9hqcr") pod "c1727578-835e-40a5-b9f2-84fbfe40a92f" (UID: "c1727578-835e-40a5-b9f2-84fbfe40a92f"). InnerVolumeSpecName "kube-api-access-9hqcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:12:57 crc kubenswrapper[4763]: I1006 15:12:57.716495 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hqcr\" (UniqueName: \"kubernetes.io/projected/c1727578-835e-40a5-b9f2-84fbfe40a92f-kube-api-access-9hqcr\") on node \"crc\" DevicePath \"\"" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.117348 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3e42-account-create-fwrfq" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.117875 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3e42-account-create-fwrfq" event={"ID":"c1727578-835e-40a5-b9f2-84fbfe40a92f","Type":"ContainerDied","Data":"6b133e2b798ee2b3b0d40b1efe615dbbf5505d7c33969d15359970c9a5beddee"} Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.117901 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b133e2b798ee2b3b0d40b1efe615dbbf5505d7c33969d15359970c9a5beddee" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.544919 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.544976 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.585382 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.587303 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.610979 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 15:12:58 crc kubenswrapper[4763]: I1006 15:12:58.718678 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.125317 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-central-agent" containerID="cri-o://702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" gracePeriod=30 Oct 06 15:12:59 crc 
kubenswrapper[4763]: I1006 15:12:59.125420 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="sg-core" containerID="cri-o://a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" gracePeriod=30 Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.125476 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="proxy-httpd" containerID="cri-o://de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" gracePeriod=30 Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.125653 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-notification-agent" containerID="cri-o://2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" gracePeriod=30 Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.125838 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.125859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 15:12:59 crc kubenswrapper[4763]: I1006 15:12:59.929522 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060307 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060424 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060459 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj2zp\" (UniqueName: \"kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060526 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060597 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.060658 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd\") pod \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\" (UID: \"8a51ba6d-6514-4487-bd6c-60f8f728c91e\") " Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.061081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.061210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.061789 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.061818 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a51ba6d-6514-4487-bd6c-60f8f728c91e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.084024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp" (OuterVolumeSpecName: "kube-api-access-vj2zp") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "kube-api-access-vj2zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.084188 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts" (OuterVolumeSpecName: "scripts") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.118726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135636 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" exitCode=0 Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135677 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" exitCode=2 Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135686 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" exitCode=0 Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135694 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" exitCode=0 Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135826 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerDied","Data":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135865 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerDied","Data":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135902 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerDied","Data":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerDied","Data":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135920 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a51ba6d-6514-4487-bd6c-60f8f728c91e","Type":"ContainerDied","Data":"1d08fcf8ddd5b949c0c6439471321ef1e1158c40e9fef98fbd501a165e66465d"} Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.135938 4763 scope.go:117] "RemoveContainer" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.163157 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.163457 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj2zp\" (UniqueName: \"kubernetes.io/projected/8a51ba6d-6514-4487-bd6c-60f8f728c91e-kube-api-access-vj2zp\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.163469 4763 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.163878 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.172709 4763 scope.go:117] "RemoveContainer" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.188460 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data" (OuterVolumeSpecName: "config-data") pod "8a51ba6d-6514-4487-bd6c-60f8f728c91e" (UID: "8a51ba6d-6514-4487-bd6c-60f8f728c91e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.192252 4763 scope.go:117] "RemoveContainer" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.224456 4763 scope.go:117] "RemoveContainer" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.259600 4763 scope.go:117] "RemoveContainer" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.260209 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": container with ID starting with de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc not found: ID does not exist" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.260267 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} err="failed to get container status \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": rpc error: code = NotFound desc = could not find container \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": container with ID starting with de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.260303 4763 scope.go:117] "RemoveContainer" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.260839 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": container with ID starting with a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351 not found: ID does not exist" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.260888 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} err="failed to get container status \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": rpc error: code = NotFound desc = could not find container \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": container with ID starting with a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.260917 4763 scope.go:117] "RemoveContainer" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.261238 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": container with ID starting with 2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825 not found: ID does not exist" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.261266 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} err="failed to get container status \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": rpc error: code = NotFound desc = could not find container \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": container with ID starting with 2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.261284 4763 scope.go:117] "RemoveContainer" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.261598 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": container with ID starting with 702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61 not found: ID does not exist" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.261645 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} err="failed to get container status \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": rpc error: code = NotFound desc = could not find container \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": container with ID starting with 702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.261664 4763 scope.go:117] "RemoveContainer" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.261974 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} err="failed to get container status \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": rpc error: code = NotFound desc = could 
not find container \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": container with ID starting with de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.262020 4763 scope.go:117] "RemoveContainer" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.262502 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} err="failed to get container status \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": rpc error: code = NotFound desc = could not find container \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": container with ID starting with a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.262533 4763 scope.go:117] "RemoveContainer" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.262930 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} err="failed to get container status \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": rpc error: code = NotFound desc = could not find container \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": container with ID starting with 2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.262962 4763 scope.go:117] "RemoveContainer" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263230 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} err="failed to get container status \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": rpc error: code = NotFound desc = could not find container \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": container with ID starting with 702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263263 4763 scope.go:117] "RemoveContainer" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263585 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} err="failed to get container status \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": rpc error: code = NotFound desc = could not find container \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": container with ID starting with de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263637 4763 scope.go:117] "RemoveContainer" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263883 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} err="failed to get container status \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": rpc error: code = NotFound desc = could not find container \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": container with ID starting with a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.263912 4763 scope.go:117] "RemoveContainer" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264137 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} err="failed to get container status \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": rpc error: code = NotFound desc = could not find container \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": container with ID starting with 2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264167 4763 scope.go:117] "RemoveContainer" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264410 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} err="failed to get container status \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": rpc error: code = NotFound desc = could not find container \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": container with ID starting with 702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264440 4763 scope.go:117] "RemoveContainer" containerID="de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264703 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc"} err="failed to get container status \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": rpc error: code = NotFound desc = could not find container \"de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc\": container with ID starting with de01f18bc94c93e28492c5f0967ca5a0fe43008bb61768f487aa448b8f2285dc not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264732 4763 scope.go:117] "RemoveContainer" containerID="a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264895 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.264922 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a51ba6d-6514-4487-bd6c-60f8f728c91e-config-data\") on node \"crc\" DevicePath \"\"" 
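The stretch of entries above is a retry storm, not a fresh failure: the kubelet's "RemoveContainer" scope keeps re-issuing deletes for the same four containers of the old ceilometer-0 pod (a7e02baa..., 2ee6bba3..., 702c2e24..., de01f18b...), and CRI-O answers NotFound every time because the containers are already gone; only the status lookup that precedes each delete is failing. When reading a dump like this it helps to collapse the noise into a per-container retry count. A minimal sketch, assuming the journal has been saved to a file (the name kubelet.log is an assumption, not something this log names):

    # count_notfound_retries.py - tally "DeleteContainer returned error" NotFound
    # retries per container ID in a saved kubelet journal dump.
    import re
    from collections import Counter

    text = open("kubelet.log", encoding="utf-8").read()
    pattern = r'"DeleteContainer returned error" containerID=\{"Type":"cri-o","ID":"([0-9a-f]{64})"\}'
    for cid, n in Counter(re.findall(pattern, text)).most_common():
        print(f"{cid[:12]}...  {n} NotFound retries")

Run over this window it would report exactly the four IDs above, several retries apiece, confirming that no fifth container is involved.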
Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.265583 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351"} err="failed to get container status \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": rpc error: code = NotFound desc = could not find container \"a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351\": container with ID starting with a7e02baa28c1e9650bdfc85b152bd916bfd299c8c549d51ff2cd0b422a8e2351 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.265628 4763 scope.go:117] "RemoveContainer" containerID="2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.265930 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825"} err="failed to get container status \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": rpc error: code = NotFound desc = could not find container \"2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825\": container with ID starting with 2ee6bba3c76b89ff7776dab548f3b866a7e0f6f64d45d54654d1dc64d3199825 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.265960 4763 scope.go:117] "RemoveContainer" containerID="702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.266213 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61"} err="failed to get container status \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": rpc error: code = NotFound desc = could not find container \"702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61\": container with ID starting with 702c2e2460ab4a2b32d97972af62f7e1d39b2216aa22f9f4952346d89e83aa61 not found: ID does not exist" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.477335 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.494838 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506177 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.506531 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-notification-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506550 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-notification-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.506583 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1727578-835e-40a5-b9f2-84fbfe40a92f" containerName="mariadb-account-create" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506592 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1727578-835e-40a5-b9f2-84fbfe40a92f" containerName="mariadb-account-create" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.506610 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="sg-core" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506637 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="sg-core" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.506655 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="proxy-httpd" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506661 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="proxy-httpd" Oct 06 15:13:00 crc kubenswrapper[4763]: E1006 15:13:00.506672 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-central-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506677 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-central-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506837 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-notification-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506847 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="ceilometer-central-agent" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506857 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="proxy-httpd" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506871 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1727578-835e-40a5-b9f2-84fbfe40a92f" containerName="mariadb-account-create" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.506886 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" containerName="sg-core" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.508546 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.511727 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.511921 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.516832 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.570728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.570788 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mc5\" (UniqueName: \"kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.570829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.570847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.571113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.571162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.571355 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mc5\" (UniqueName: \"kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: 
I1006 15:13:00.673170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673192 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673267 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.673868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.674200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.676986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.677579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.678126 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.686595 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.691114 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mc5\" (UniqueName: \"kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5\") pod \"ceilometer-0\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " pod="openstack/ceilometer-0" Oct 06 15:13:00 crc kubenswrapper[4763]: I1006 15:13:00.848477 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.205147 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.324839 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.587843 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a51ba6d-6514-4487-bd6c-60f8f728c91e" path="/var/lib/kubelet/pods/8a51ba6d-6514-4487-bd6c-60f8f728c91e/volumes" Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.604732 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.605140 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 15:13:01 crc kubenswrapper[4763]: I1006 15:13:01.625521 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 15:13:02 crc kubenswrapper[4763]: I1006 15:13:02.159070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerStarted","Data":"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32"} Oct 06 15:13:02 crc kubenswrapper[4763]: I1006 15:13:02.159332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerStarted","Data":"72ec964cf7db221aceab66a05beb6eab9b180d0e2e1002dc98bec1e2bc72d2b9"} Oct 06 15:13:03 crc kubenswrapper[4763]: I1006 15:13:03.168313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerStarted","Data":"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb"} Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.182604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerStarted","Data":"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b"} Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.687389 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ee6c-account-create-mk2rc"] Oct 06 15:13:04 crc 
kubenswrapper[4763]: I1006 15:13:04.689266 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.691506 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.706292 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee6c-account-create-mk2rc"] Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.754043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsk8z\" (UniqueName: \"kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z\") pod \"nova-cell0-ee6c-account-create-mk2rc\" (UID: \"d01055e0-1ab7-4f74-89b5-1227aa259394\") " pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.856736 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsk8z\" (UniqueName: \"kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z\") pod \"nova-cell0-ee6c-account-create-mk2rc\" (UID: \"d01055e0-1ab7-4f74-89b5-1227aa259394\") " pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.872099 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ca15-account-create-x7kw9"] Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.873554 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.876447 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.880429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsk8z\" (UniqueName: \"kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z\") pod \"nova-cell0-ee6c-account-create-mk2rc\" (UID: \"d01055e0-1ab7-4f74-89b5-1227aa259394\") " pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.883449 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ca15-account-create-x7kw9"] Oct 06 15:13:04 crc kubenswrapper[4763]: I1006 15:13:04.958194 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j299\" (UniqueName: \"kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299\") pod \"nova-cell1-ca15-account-create-x7kw9\" (UID: \"8b629946-f324-44b2-a40f-fcef64ee4766\") " pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.008805 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.059912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j299\" (UniqueName: \"kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299\") pod \"nova-cell1-ca15-account-create-x7kw9\" (UID: \"8b629946-f324-44b2-a40f-fcef64ee4766\") " pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.082268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j299\" (UniqueName: \"kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299\") pod \"nova-cell1-ca15-account-create-x7kw9\" (UID: \"8b629946-f324-44b2-a40f-fcef64ee4766\") " pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.202354 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerStarted","Data":"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76"} Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.202861 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-central-agent" containerID="cri-o://a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32" gracePeriod=30 Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.202977 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="proxy-httpd" containerID="cri-o://f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76" gracePeriod=30 Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.203063 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-notification-agent" containerID="cri-o://68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb" gracePeriod=30 Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.202977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.203176 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="sg-core" containerID="cri-o://efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b" gracePeriod=30 Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.233113 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.847893354 podStartE2EDuration="5.233096656s" podCreationTimestamp="2025-10-06 15:13:00 +0000 UTC" firstStartedPulling="2025-10-06 15:13:01.33538915 +0000 UTC m=+1178.490681662" lastFinishedPulling="2025-10-06 15:13:04.720592452 +0000 UTC m=+1181.875884964" observedRunningTime="2025-10-06 15:13:05.230478914 +0000 UTC m=+1182.385771426" watchObservedRunningTime="2025-10-06 15:13:05.233096656 +0000 UTC m=+1182.388389168" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.275768 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.525957 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee6c-account-create-mk2rc"] Oct 06 15:13:05 crc kubenswrapper[4763]: W1006 15:13:05.534513 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd01055e0_1ab7_4f74_89b5_1227aa259394.slice/crio-baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908 WatchSource:0}: Error finding container baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908: Status 404 returned error can't find the container with id baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908 Oct 06 15:13:05 crc kubenswrapper[4763]: I1006 15:13:05.703085 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ca15-account-create-x7kw9"] Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.212145 4763 generic.go:334] "Generic (PLEG): container finished" podID="d01055e0-1ab7-4f74-89b5-1227aa259394" containerID="0cb10096907fb2a5c3d74e1bc6aad8f1891f33bcd0cf07fae66a4c4f6c4f002c" exitCode=0 Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.212235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" event={"ID":"d01055e0-1ab7-4f74-89b5-1227aa259394","Type":"ContainerDied","Data":"0cb10096907fb2a5c3d74e1bc6aad8f1891f33bcd0cf07fae66a4c4f6c4f002c"} Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.212533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" event={"ID":"d01055e0-1ab7-4f74-89b5-1227aa259394","Type":"ContainerStarted","Data":"baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908"} Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220432 4763 generic.go:334] "Generic (PLEG): container finished" podID="03256f54-78cd-480c-9b33-36030e821372" containerID="f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76" exitCode=0 Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220465 4763 generic.go:334] "Generic (PLEG): container finished" podID="03256f54-78cd-480c-9b33-36030e821372" containerID="efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b" exitCode=2 Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220474 4763 generic.go:334] "Generic (PLEG): container finished" podID="03256f54-78cd-480c-9b33-36030e821372" containerID="68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb" exitCode=0 Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerDied","Data":"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76"} Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerDied","Data":"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b"} Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.220541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerDied","Data":"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb"} Oct 06 15:13:06 crc 
kubenswrapper[4763]: I1006 15:13:06.227900 4763 generic.go:334] "Generic (PLEG): container finished" podID="8b629946-f324-44b2-a40f-fcef64ee4766" containerID="e01c05d2c451fef7ff0f3b5fbec07d8d7fe6fdefbd56476079b9af62d60685a6" exitCode=0 Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.227951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ca15-account-create-x7kw9" event={"ID":"8b629946-f324-44b2-a40f-fcef64ee4766","Type":"ContainerDied","Data":"e01c05d2c451fef7ff0f3b5fbec07d8d7fe6fdefbd56476079b9af62d60685a6"} Oct 06 15:13:06 crc kubenswrapper[4763]: I1006 15:13:06.227977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ca15-account-create-x7kw9" event={"ID":"8b629946-f324-44b2-a40f-fcef64ee4766","Type":"ContainerStarted","Data":"e7c10e14c8f1452be8831b55fae065bf8c4f53627349fc0732bbdd0e10ea58be"} Oct 06 15:13:07 crc kubenswrapper[4763]: I1006 15:13:07.848136 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:07 crc kubenswrapper[4763]: I1006 15:13:07.862061 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:07 crc kubenswrapper[4763]: I1006 15:13:07.925179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsk8z\" (UniqueName: \"kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z\") pod \"d01055e0-1ab7-4f74-89b5-1227aa259394\" (UID: \"d01055e0-1ab7-4f74-89b5-1227aa259394\") " Oct 06 15:13:07 crc kubenswrapper[4763]: I1006 15:13:07.926972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z" (OuterVolumeSpecName: "kube-api-access-nsk8z") pod "d01055e0-1ab7-4f74-89b5-1227aa259394" (UID: "d01055e0-1ab7-4f74-89b5-1227aa259394"). InnerVolumeSpecName "kube-api-access-nsk8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:07 crc kubenswrapper[4763]: I1006 15:13:07.979263 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.027717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j299\" (UniqueName: \"kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299\") pod \"8b629946-f324-44b2-a40f-fcef64ee4766\" (UID: \"8b629946-f324-44b2-a40f-fcef64ee4766\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.028124 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsk8z\" (UniqueName: \"kubernetes.io/projected/d01055e0-1ab7-4f74-89b5-1227aa259394-kube-api-access-nsk8z\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.030836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299" (OuterVolumeSpecName: "kube-api-access-6j299") pod "8b629946-f324-44b2-a40f-fcef64ee4766" (UID: "8b629946-f324-44b2-a40f-fcef64ee4766"). InnerVolumeSpecName "kube-api-access-6j299". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129339 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129474 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.129697 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6mc5\" (UniqueName: \"kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.130099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.130282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd\") pod \"03256f54-78cd-480c-9b33-36030e821372\" (UID: \"03256f54-78cd-480c-9b33-36030e821372\") " Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.130540 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.131286 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j299\" (UniqueName: \"kubernetes.io/projected/8b629946-f324-44b2-a40f-fcef64ee4766-kube-api-access-6j299\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.131315 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.131327 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03256f54-78cd-480c-9b33-36030e821372-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.134290 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5" (OuterVolumeSpecName: "kube-api-access-b6mc5") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "kube-api-access-b6mc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.138163 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts" (OuterVolumeSpecName: "scripts") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.155197 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.221401 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data" (OuterVolumeSpecName: "config-data") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.232730 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.232755 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.232871 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.232882 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6mc5\" (UniqueName: \"kubernetes.io/projected/03256f54-78cd-480c-9b33-36030e821372-kube-api-access-b6mc5\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.232847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03256f54-78cd-480c-9b33-36030e821372" (UID: "03256f54-78cd-480c-9b33-36030e821372"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.251746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ca15-account-create-x7kw9" event={"ID":"8b629946-f324-44b2-a40f-fcef64ee4766","Type":"ContainerDied","Data":"e7c10e14c8f1452be8831b55fae065bf8c4f53627349fc0732bbdd0e10ea58be"} Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.251780 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ca15-account-create-x7kw9" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.251790 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c10e14c8f1452be8831b55fae065bf8c4f53627349fc0732bbdd0e10ea58be" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.253130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" event={"ID":"d01055e0-1ab7-4f74-89b5-1227aa259394","Type":"ContainerDied","Data":"baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908"} Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.253163 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa641191f649facaa61fcdb113a6f8be4f7353eb3df018558020610710d8908" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.253195 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ee6c-account-create-mk2rc" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.255632 4763 generic.go:334] "Generic (PLEG): container finished" podID="03256f54-78cd-480c-9b33-36030e821372" containerID="a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32" exitCode=0 Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.255666 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerDied","Data":"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32"} Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.255686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03256f54-78cd-480c-9b33-36030e821372","Type":"ContainerDied","Data":"72ec964cf7db221aceab66a05beb6eab9b180d0e2e1002dc98bec1e2bc72d2b9"} Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.255695 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.255705 4763 scope.go:117] "RemoveContainer" containerID="f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.278315 4763 scope.go:117] "RemoveContainer" containerID="efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.292030 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.300350 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.317727 4763 scope.go:117] "RemoveContainer" containerID="68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.331946 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332375 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="sg-core" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332389 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="sg-core" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332411 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-notification-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332419 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-notification-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332439 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="proxy-httpd" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332449 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="proxy-httpd" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332463 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-central-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332470 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-central-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332485 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01055e0-1ab7-4f74-89b5-1227aa259394" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332493 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01055e0-1ab7-4f74-89b5-1227aa259394" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.332518 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b629946-f324-44b2-a40f-fcef64ee4766" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332525 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b629946-f324-44b2-a40f-fcef64ee4766" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332820 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01055e0-1ab7-4f74-89b5-1227aa259394" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332839 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="proxy-httpd" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332853 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b629946-f324-44b2-a40f-fcef64ee4766" containerName="mariadb-account-create" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332866 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-notification-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332884 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="ceilometer-central-agent" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.332900 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="03256f54-78cd-480c-9b33-36030e821372" containerName="sg-core" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.334463 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03256f54-78cd-480c-9b33-36030e821372-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.334769 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.339975 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.339978 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.340745 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.365151 4763 scope.go:117] "RemoveContainer" containerID="a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.381682 4763 scope.go:117] "RemoveContainer" containerID="f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.382141 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76\": container with ID starting with f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76 not found: ID does not exist" containerID="f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382201 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76"} err="failed to get container status \"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76\": rpc error: code = NotFound desc = could not find container \"f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76\": container with ID starting with f28b40bc494a6af5aeeb73538a8434254e371d4214fc0b653d7a20a6bbf64f76 not found: ID does not exist" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382226 4763 scope.go:117] "RemoveContainer" containerID="efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.382469 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b\": container with ID starting with efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b not found: ID does not exist" containerID="efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382498 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b"} err="failed to get container status \"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b\": rpc error: code = NotFound desc = could not find container \"efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b\": container with ID starting with efdb0d931524789c5a5a5cfbeb2b2e8388da58eccaf319d96317318260085a6b not found: ID does not exist" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382517 4763 scope.go:117] "RemoveContainer" containerID="68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.382860 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb\": container with ID starting with 68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb not found: ID does not exist" containerID="68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382884 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb"} err="failed to get container status \"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb\": rpc error: code = NotFound desc = could not find container \"68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb\": container with ID starting with 68122f3d28c85d428c914d898c57644d75291b8ac86f9f5a0cd7e2d62ccfcafb not found: ID does not exist" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.382923 4763 scope.go:117] "RemoveContainer" containerID="a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32" Oct 06 15:13:08 crc kubenswrapper[4763]: E1006 15:13:08.383164 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32\": container with ID starting with a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32 not found: ID does not exist" containerID="a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.383189 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32"} err="failed to get container status \"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32\": rpc error: code = NotFound desc = could not find container \"a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32\": container with ID starting with a533095ce2f804780f03a9d147bf5a18394623681ab8afef2fed92c1ea95dd32 not found: ID does not exist" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.435843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mpx\" (UniqueName: \"kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.435961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.435994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.436028 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd\") pod \"ceilometer-0\" (UID: 
\"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.436047 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.436070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.436084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.537956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mpx\" (UniqueName: \"kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538117 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538139 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.538252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.539243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.539501 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.542329 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.545179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.546588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.551825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.555002 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mpx\" (UniqueName: \"kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx\") pod \"ceilometer-0\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " pod="openstack/ceilometer-0" Oct 06 15:13:08 crc kubenswrapper[4763]: I1006 15:13:08.650187 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:09 crc kubenswrapper[4763]: W1006 15:13:09.100226 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf726051_bc53_4906_8fd6_b7d8761ee110.slice/crio-2b794c6a54d5c7e44e8e3f13bbc2e05d930d156401f6783a6c071c4d41739cfa WatchSource:0}: Error finding container 2b794c6a54d5c7e44e8e3f13bbc2e05d930d156401f6783a6c071c4d41739cfa: Status 404 returned error can't find the container with id 2b794c6a54d5c7e44e8e3f13bbc2e05d930d156401f6783a6c071c4d41739cfa Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.104149 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.269307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerStarted","Data":"2b794c6a54d5c7e44e8e3f13bbc2e05d930d156401f6783a6c071c4d41739cfa"} Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.599904 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03256f54-78cd-480c-9b33-36030e821372" path="/var/lib/kubelet/pods/03256f54-78cd-480c-9b33-36030e821372/volumes" Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.950301 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqmkx"] Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.952654 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.954221 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gtvc2" Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.954661 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.955018 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 15:13:09 crc kubenswrapper[4763]: I1006 15:13:09.970923 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqmkx"] Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.063658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.063720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.063749 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " 
pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.064119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjcc8\" (UniqueName: \"kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.166136 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.166491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.166522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.166705 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjcc8\" (UniqueName: \"kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.176696 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.176714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.179429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.189386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjcc8\" (UniqueName: \"kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8\") pod \"nova-cell0-conductor-db-sync-qqmkx\" (UID: 
\"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.279744 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.280625 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerStarted","Data":"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89"} Oct 06 15:13:10 crc kubenswrapper[4763]: I1006 15:13:10.739189 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqmkx"] Oct 06 15:13:10 crc kubenswrapper[4763]: W1006 15:13:10.746436 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf23f79f_d083_4a30_81fb_8f2791aa17eb.slice/crio-46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63 WatchSource:0}: Error finding container 46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63: Status 404 returned error can't find the container with id 46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63 Oct 06 15:13:11 crc kubenswrapper[4763]: I1006 15:13:11.290348 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" event={"ID":"af23f79f-d083-4a30-81fb-8f2791aa17eb","Type":"ContainerStarted","Data":"46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63"} Oct 06 15:13:11 crc kubenswrapper[4763]: I1006 15:13:11.293052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerStarted","Data":"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e"} Oct 06 15:13:12 crc kubenswrapper[4763]: I1006 15:13:12.305427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerStarted","Data":"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894"} Oct 06 15:13:15 crc kubenswrapper[4763]: I1006 15:13:15.461468 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:18 crc kubenswrapper[4763]: I1006 15:13:18.014658 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7ae39c8a-af64-489c-bb82-27c32c21279e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7ae39c8a-af64-489c-bb82-27c32c21279e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7ae39c8a_af64_489c_bb82_27c32c21279e.slice" Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.379601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" event={"ID":"af23f79f-d083-4a30-81fb-8f2791aa17eb","Type":"ContainerStarted","Data":"6a9b20147a6347915ae66b7027ee7a03bca0f91c2b7745f8b0f3b2ac537af3dd"} Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerStarted","Data":"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c"} Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384445 4763 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-central-agent" containerID="cri-o://0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89" gracePeriod=30 Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384477 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="sg-core" containerID="cri-o://a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894" gracePeriod=30 Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384525 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-notification-agent" containerID="cri-o://ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e" gracePeriod=30 Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384525 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.384547 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="proxy-httpd" containerID="cri-o://b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c" gracePeriod=30 Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.415073 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" podStartSLOduration=2.942301562 podStartE2EDuration="10.415052282s" podCreationTimestamp="2025-10-06 15:13:09 +0000 UTC" firstStartedPulling="2025-10-06 15:13:10.750416525 +0000 UTC m=+1187.905709057" lastFinishedPulling="2025-10-06 15:13:18.223167265 +0000 UTC m=+1195.378459777" observedRunningTime="2025-10-06 15:13:19.408148521 +0000 UTC m=+1196.563441043" watchObservedRunningTime="2025-10-06 15:13:19.415052282 +0000 UTC m=+1196.570344804" Oct 06 15:13:19 crc kubenswrapper[4763]: I1006 15:13:19.436703 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.326791262 podStartE2EDuration="11.436681768s" podCreationTimestamp="2025-10-06 15:13:08 +0000 UTC" firstStartedPulling="2025-10-06 15:13:09.102561233 +0000 UTC m=+1186.257853745" lastFinishedPulling="2025-10-06 15:13:18.212451739 +0000 UTC m=+1195.367744251" observedRunningTime="2025-10-06 15:13:19.429574792 +0000 UTC m=+1196.584867314" watchObservedRunningTime="2025-10-06 15:13:19.436681768 +0000 UTC m=+1196.591974280" Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399179 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerID="b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c" exitCode=0 Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399420 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerID="a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894" exitCode=2 Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399430 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerID="0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89" exitCode=0 Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399277 4763 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerDied","Data":"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c"} Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerDied","Data":"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894"} Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.399645 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerDied","Data":"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89"} Oct 06 15:13:20 crc kubenswrapper[4763]: I1006 15:13:20.935376 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091168 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7mpx\" (UniqueName: \"kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091234 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091428 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091466 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.091481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml\") pod \"bf726051-bc53-4906-8fd6-b7d8761ee110\" (UID: \"bf726051-bc53-4906-8fd6-b7d8761ee110\") " Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.092020 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.092104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.097385 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx" (OuterVolumeSpecName: "kube-api-access-j7mpx") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "kube-api-access-j7mpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.107860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts" (OuterVolumeSpecName: "scripts") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.117962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.191518 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194072 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194102 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194144 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194158 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7mpx\" (UniqueName: \"kubernetes.io/projected/bf726051-bc53-4906-8fd6-b7d8761ee110-kube-api-access-j7mpx\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194168 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.194179 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf726051-bc53-4906-8fd6-b7d8761ee110-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.252833 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data" (OuterVolumeSpecName: "config-data") pod "bf726051-bc53-4906-8fd6-b7d8761ee110" (UID: "bf726051-bc53-4906-8fd6-b7d8761ee110"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.295414 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf726051-bc53-4906-8fd6-b7d8761ee110-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.417698 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerID="ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e" exitCode=0 Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.417751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerDied","Data":"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e"} Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.417798 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.417833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf726051-bc53-4906-8fd6-b7d8761ee110","Type":"ContainerDied","Data":"2b794c6a54d5c7e44e8e3f13bbc2e05d930d156401f6783a6c071c4d41739cfa"} Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.417867 4763 scope.go:117] "RemoveContainer" containerID="b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.459184 4763 scope.go:117] "RemoveContainer" containerID="a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.469800 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.477647 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.504166 4763 scope.go:117] "RemoveContainer" containerID="ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.519835 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.520424 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="proxy-httpd" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.520453 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="proxy-httpd" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.520476 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-notification-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.520491 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-notification-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.520527 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="sg-core" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.520541 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="sg-core" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.520568 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-central-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.520580 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-central-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.521013 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-central-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.521049 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="ceilometer-notification-agent" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.521079 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" 
containerName="proxy-httpd" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.521113 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" containerName="sg-core" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.524605 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.526888 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.527024 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.534044 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604528 4763 scope.go:117] "RemoveContainer" containerID="0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604596 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrdpm\" (UniqueName: \"kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604654 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604683 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.604932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.605714 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf726051-bc53-4906-8fd6-b7d8761ee110" path="/var/lib/kubelet/pods/bf726051-bc53-4906-8fd6-b7d8761ee110/volumes" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.630702 4763 scope.go:117] "RemoveContainer" containerID="b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.631109 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c\": container with ID starting with b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c not found: ID does not exist" containerID="b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.631143 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c"} err="failed to get container status \"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c\": rpc error: code = NotFound desc = could not find container \"b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c\": container with ID starting with b0aca79e588e06551d69403e89510f88657405329e4b19409ba0d5713474709c not found: ID does not exist" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.631163 4763 scope.go:117] "RemoveContainer" containerID="a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.631606 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894\": container with ID starting with a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894 not found: ID does not exist" containerID="a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.631688 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894"} err="failed to get container status \"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894\": rpc error: code = NotFound desc = could not find container \"a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894\": container with ID starting with a489c50f4f1669c87e2cee32a3a2067583d37af2aa0b83f2c36e44466d74d894 not found: ID does not exist" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.631718 4763 scope.go:117] "RemoveContainer" containerID="ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.632016 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e\": container with ID starting with ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e not found: ID does not exist" containerID="ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e" Oct 06 
15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.632043 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e"} err="failed to get container status \"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e\": rpc error: code = NotFound desc = could not find container \"ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e\": container with ID starting with ab5a3a1d167ff85685b0df4e476e440266bf191117ca475c5580ecc0b73d979e not found: ID does not exist" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.632060 4763 scope.go:117] "RemoveContainer" containerID="0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89" Oct 06 15:13:21 crc kubenswrapper[4763]: E1006 15:13:21.632265 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89\": container with ID starting with 0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89 not found: ID does not exist" containerID="0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.632290 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89"} err="failed to get container status \"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89\": rpc error: code = NotFound desc = could not find container \"0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89\": container with ID starting with 0942d8231eed658a248e5375e55b860dcd6cfc59495dd21e1d5dffbac1ea3b89 not found: ID does not exist" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707095 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrdpm\" (UniqueName: \"kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707164 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707311 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data\") pod \"ceilometer-0\" (UID: 
\"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.707550 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.709432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.709964 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.712809 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.712887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.713051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.713283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.723276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrdpm\" (UniqueName: \"kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm\") pod \"ceilometer-0\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") " pod="openstack/ceilometer-0" Oct 06 15:13:21 crc kubenswrapper[4763]: I1006 15:13:21.917097 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:13:22 crc kubenswrapper[4763]: I1006 15:13:22.417268 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:13:22 crc kubenswrapper[4763]: W1006 15:13:22.425754 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd17ae0bf_7064_4085_9b58_069a7c56ed7d.slice/crio-3ad92d0f4b35d31ad834491c0bb8d4cec80a3e4bd6642644912f37218c397c53 WatchSource:0}: Error finding container 3ad92d0f4b35d31ad834491c0bb8d4cec80a3e4bd6642644912f37218c397c53: Status 404 returned error can't find the container with id 3ad92d0f4b35d31ad834491c0bb8d4cec80a3e4bd6642644912f37218c397c53 Oct 06 15:13:23 crc kubenswrapper[4763]: I1006 15:13:23.452580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerStarted","Data":"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be"} Oct 06 15:13:23 crc kubenswrapper[4763]: I1006 15:13:23.453016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerStarted","Data":"3ad92d0f4b35d31ad834491c0bb8d4cec80a3e4bd6642644912f37218c397c53"} Oct 06 15:13:24 crc kubenswrapper[4763]: I1006 15:13:24.467568 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerStarted","Data":"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792"} Oct 06 15:13:25 crc kubenswrapper[4763]: I1006 15:13:25.483692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerStarted","Data":"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8"} Oct 06 15:13:26 crc kubenswrapper[4763]: I1006 15:13:26.509386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerStarted","Data":"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898"} Oct 06 15:13:26 crc kubenswrapper[4763]: I1006 15:13:26.510001 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:13:26 crc kubenswrapper[4763]: I1006 15:13:26.539156 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.814331101 podStartE2EDuration="5.539136672s" podCreationTimestamp="2025-10-06 15:13:21 +0000 UTC" firstStartedPulling="2025-10-06 15:13:22.431719937 +0000 UTC m=+1199.587012449" lastFinishedPulling="2025-10-06 15:13:26.156525468 +0000 UTC m=+1203.311818020" observedRunningTime="2025-10-06 15:13:26.534658129 +0000 UTC m=+1203.689950651" watchObservedRunningTime="2025-10-06 15:13:26.539136672 +0000 UTC m=+1203.694429204" Oct 06 15:13:28 crc kubenswrapper[4763]: I1006 15:13:28.529517 4763 generic.go:334] "Generic (PLEG): container finished" podID="af23f79f-d083-4a30-81fb-8f2791aa17eb" containerID="6a9b20147a6347915ae66b7027ee7a03bca0f91c2b7745f8b0f3b2ac537af3dd" exitCode=0 Oct 06 15:13:28 crc kubenswrapper[4763]: I1006 15:13:28.529803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" 
event={"ID":"af23f79f-d083-4a30-81fb-8f2791aa17eb","Type":"ContainerDied","Data":"6a9b20147a6347915ae66b7027ee7a03bca0f91c2b7745f8b0f3b2ac537af3dd"} Oct 06 15:13:29 crc kubenswrapper[4763]: I1006 15:13:29.914712 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.074080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle\") pod \"af23f79f-d083-4a30-81fb-8f2791aa17eb\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.074479 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjcc8\" (UniqueName: \"kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8\") pod \"af23f79f-d083-4a30-81fb-8f2791aa17eb\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.074846 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts\") pod \"af23f79f-d083-4a30-81fb-8f2791aa17eb\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.076875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data\") pod \"af23f79f-d083-4a30-81fb-8f2791aa17eb\" (UID: \"af23f79f-d083-4a30-81fb-8f2791aa17eb\") " Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.084146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts" (OuterVolumeSpecName: "scripts") pod "af23f79f-d083-4a30-81fb-8f2791aa17eb" (UID: "af23f79f-d083-4a30-81fb-8f2791aa17eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.088350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8" (OuterVolumeSpecName: "kube-api-access-vjcc8") pod "af23f79f-d083-4a30-81fb-8f2791aa17eb" (UID: "af23f79f-d083-4a30-81fb-8f2791aa17eb"). InnerVolumeSpecName "kube-api-access-vjcc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.127817 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af23f79f-d083-4a30-81fb-8f2791aa17eb" (UID: "af23f79f-d083-4a30-81fb-8f2791aa17eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.154112 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data" (OuterVolumeSpecName: "config-data") pod "af23f79f-d083-4a30-81fb-8f2791aa17eb" (UID: "af23f79f-d083-4a30-81fb-8f2791aa17eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.180816 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.180855 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.180878 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjcc8\" (UniqueName: \"kubernetes.io/projected/af23f79f-d083-4a30-81fb-8f2791aa17eb-kube-api-access-vjcc8\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.180890 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af23f79f-d083-4a30-81fb-8f2791aa17eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.552985 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" event={"ID":"af23f79f-d083-4a30-81fb-8f2791aa17eb","Type":"ContainerDied","Data":"46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63"} Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.553051 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46836d84d07440db0faa2015e360cd1a0a7b914820339b471cbacbfbeacdfe63" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.553090 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qqmkx" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.668692 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 15:13:30 crc kubenswrapper[4763]: E1006 15:13:30.669587 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af23f79f-d083-4a30-81fb-8f2791aa17eb" containerName="nova-cell0-conductor-db-sync" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.669634 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="af23f79f-d083-4a30-81fb-8f2791aa17eb" containerName="nova-cell0-conductor-db-sync" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.669963 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="af23f79f-d083-4a30-81fb-8f2791aa17eb" containerName="nova-cell0-conductor-db-sync" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.670867 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.674179 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gtvc2" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.674403 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.691809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.792759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.792835 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.792866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phb8l\" (UniqueName: \"kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.894885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.894981 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phb8l\" (UniqueName: \"kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.895340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.910810 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.912165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:30 crc kubenswrapper[4763]: I1006 15:13:30.925267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phb8l\" (UniqueName: \"kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l\") pod \"nova-cell0-conductor-0\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") " pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:31 crc kubenswrapper[4763]: I1006 15:13:31.005688 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:31 crc kubenswrapper[4763]: I1006 15:13:31.567226 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 15:13:32 crc kubenswrapper[4763]: I1006 15:13:32.574393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcb3dd44-b7c9-4653-930a-113565fccec1","Type":"ContainerStarted","Data":"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"} Oct 06 15:13:32 crc kubenswrapper[4763]: I1006 15:13:32.574680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcb3dd44-b7c9-4653-930a-113565fccec1","Type":"ContainerStarted","Data":"d0fb7d133d0c3b61106d7b6aa1d97eb16a0c44b0a5e1c7aeee243b0c8b5ca131"} Oct 06 15:13:32 crc kubenswrapper[4763]: I1006 15:13:32.592262 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.592242357 podStartE2EDuration="2.592242357s" podCreationTimestamp="2025-10-06 15:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:32.590701655 +0000 UTC m=+1209.745994167" watchObservedRunningTime="2025-10-06 15:13:32.592242357 +0000 UTC m=+1209.747534869" Oct 06 15:13:33 crc kubenswrapper[4763]: I1006 15:13:33.597235 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:33 crc kubenswrapper[4763]: I1006 15:13:33.877127 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:13:33 crc kubenswrapper[4763]: I1006 15:13:33.877501 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.054273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.500482 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-l75n6"] Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.501643 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.503378 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.503458 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.537765 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l75n6"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.616687 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.616822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.616897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.616927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jjc\" (UniqueName: \"kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.707085 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.708418 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.716994 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.717232 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.718996 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.719389 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.719439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.719471 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.719491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jjc\" (UniqueName: \"kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.757463 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.769127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.769408 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.777469 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.806343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jjc\" (UniqueName: \"kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc\") pod \"nova-cell0-cell-mapping-l75n6\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rnd\" (UniqueName: \"kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.822488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.828032 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l75n6"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.859247 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.903446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924530 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rnd\" (UniqueName: \"kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924726 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924744 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.924803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.925148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.925406 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.925739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.926777 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.930570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.931023 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.931481 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.931742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.934128 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.941229 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rnd\" (UniqueName: \"kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.948657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.953556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv\") pod \"nova-metadata-0\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.954412 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.959801 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.961466 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.967953 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.982551 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.988580 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.990516 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:13:36 crc kubenswrapper[4763]: I1006 15:13:36.993095 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.004936 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5snz\" (UniqueName: \"kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028286 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028308 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2"
Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkxk\" (UniqueName: \"kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfvx\" (UniqueName: \"kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028483 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.028521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.131782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5snz\" (UniqueName: \"kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132111 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132132 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132175 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkxk\" (UniqueName: \"kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfvx\" (UniqueName: \"kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.132294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.134837 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.135386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.136139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.136463 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.136631 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.136977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.140702 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.140933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.141254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.152165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkxk\" (UniqueName: \"kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.152207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5snz\" (UniqueName: \"kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz\") pod \"dnsmasq-dns-845d6d6f59-b9vb2\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.156929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfvx\" (UniqueName: \"kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx\") pod \"nova-scheduler-0\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") " pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.296368 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.315459 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.325369 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.412190 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l75n6"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.509263 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.548240 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-94w6q"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.549447 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.554214 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.558116 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.599721 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-94w6q"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.602548 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:37 crc kubenswrapper[4763]: W1006 15:13:37.604838 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8aab9b_6a2e_456b_8597_34e44d98e725.slice/crio-11e961c4ff4ea632514fb621dc896f69308028ff7d81b877eb1f25ab4a821582 WatchSource:0}: Error finding container 11e961c4ff4ea632514fb621dc896f69308028ff7d81b877eb1f25ab4a821582: Status 404 returned error can't find the container with id 11e961c4ff4ea632514fb621dc896f69308028ff7d81b877eb1f25ab4a821582 Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.642722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerStarted","Data":"65115602481f4d31165ea2ab2db64f97f83ff762cc38132c0cfe81d46414eb2d"} Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.646653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfs4g\" (UniqueName: \"kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.646737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.646825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.646847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.649457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerStarted","Data":"11e961c4ff4ea632514fb621dc896f69308028ff7d81b877eb1f25ab4a821582"} Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.650899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l75n6" event={"ID":"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae","Type":"ContainerStarted","Data":"9a8460dcfe307412784d55425949b5cddb2757cb17c32f9dd94c731d4f504a8d"} Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.748900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfs4g\" (UniqueName: \"kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.748965 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.749025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.749048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.756879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.757173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.757868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.770251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfs4g\" (UniqueName: \"kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g\") pod \"nova-cell1-conductor-db-sync-94w6q\" (UID: 
\"325c5548-0def-436c-9c96-326c91d9b06e\") " pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.875232 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.903459 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.971221 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:13:37 crc kubenswrapper[4763]: I1006 15:13:37.979178 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"] Oct 06 15:13:37 crc kubenswrapper[4763]: W1006 15:13:37.998862 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86597dc8_a99b_4220_a89d_eb7e8a117546.slice/crio-cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f WatchSource:0}: Error finding container cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f: Status 404 returned error can't find the container with id cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.380572 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-94w6q"] Oct 06 15:13:38 crc kubenswrapper[4763]: W1006 15:13:38.386246 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod325c5548_0def_436c_9c96_326c91d9b06e.slice/crio-24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16 WatchSource:0}: Error finding container 24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16: Status 404 returned error can't find the container with id 24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16 Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.662486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1","Type":"ContainerStarted","Data":"5f2f8f07ceee57edf7b0666754cf30784ffe8b9ecd9b7a750b102a6696b41da7"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.669400 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l75n6" event={"ID":"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae","Type":"ContainerStarted","Data":"bde4aed9fd9a6c5eefae7c1cc33564849b0279bade7ce7e70e2159cd7431f7f7"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.672383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-94w6q" event={"ID":"325c5548-0def-436c-9c96-326c91d9b06e","Type":"ContainerStarted","Data":"1a19de1d681a2184e6bc1c868810837a0a80d978dbe170b9eb27d56fe034b52f"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.672423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-94w6q" event={"ID":"325c5548-0def-436c-9c96-326c91d9b06e","Type":"ContainerStarted","Data":"24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.674568 4763 generic.go:334] "Generic (PLEG): container finished" podID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerID="f120c52d33a0f9d6e29cb913ca8a9664d3a0d2f8ad98c0347d03b9ff605947bd" exitCode=0 Oct 06 15:13:38 crc 
kubenswrapper[4763]: I1006 15:13:38.674630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" event={"ID":"86597dc8-a99b-4220-a89d-eb7e8a117546","Type":"ContainerDied","Data":"f120c52d33a0f9d6e29cb913ca8a9664d3a0d2f8ad98c0347d03b9ff605947bd"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.674729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" event={"ID":"86597dc8-a99b-4220-a89d-eb7e8a117546","Type":"ContainerStarted","Data":"cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.677934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e","Type":"ContainerStarted","Data":"bd3715936f2b8b5b8117560b41b2470b062b366e8167286ea6c309f6722d6b8a"} Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.699660 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-l75n6" podStartSLOduration=2.69964138 podStartE2EDuration="2.69964138s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:38.683557857 +0000 UTC m=+1215.838850359" watchObservedRunningTime="2025-10-06 15:13:38.69964138 +0000 UTC m=+1215.854933892" Oct 06 15:13:38 crc kubenswrapper[4763]: I1006 15:13:38.736524 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-94w6q" podStartSLOduration=1.736505376 podStartE2EDuration="1.736505376s" podCreationTimestamp="2025-10-06 15:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:38.725709159 +0000 UTC m=+1215.881001681" watchObservedRunningTime="2025-10-06 15:13:38.736505376 +0000 UTC m=+1215.891797888" Oct 06 15:13:39 crc kubenswrapper[4763]: I1006 15:13:39.696934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" event={"ID":"86597dc8-a99b-4220-a89d-eb7e8a117546","Type":"ContainerStarted","Data":"ccd62700fafdf3a6f1684e5b5db825da9d723dfeda19d35346fbeaad12324951"} Oct 06 15:13:39 crc kubenswrapper[4763]: I1006 15:13:39.697365 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:39 crc kubenswrapper[4763]: I1006 15:13:39.718327 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" podStartSLOduration=3.718305522 podStartE2EDuration="3.718305522s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:39.713923212 +0000 UTC m=+1216.869215724" watchObservedRunningTime="2025-10-06 15:13:39.718305522 +0000 UTC m=+1216.873598034" Oct 06 15:13:40 crc kubenswrapper[4763]: I1006 15:13:40.699609 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:40 crc kubenswrapper[4763]: I1006 15:13:40.714048 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.722911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerStarted","Data":"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.724463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerStarted","Data":"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.723021 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-metadata" containerID="cri-o://c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" gracePeriod=30 Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.722988 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-log" containerID="cri-o://81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" gracePeriod=30 Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.724944 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1","Type":"ContainerStarted","Data":"126667a9436f0492743db23237eed57cacad4240ef3e8dadf49cb9b3691ef0dc"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.729032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerStarted","Data":"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.729078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerStarted","Data":"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.732485 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a" gracePeriod=30 Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.732582 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e","Type":"ContainerStarted","Data":"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a"} Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.763148 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.796825979 podStartE2EDuration="5.763131186s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="2025-10-06 15:13:37.892744573 +0000 UTC m=+1215.048037085" lastFinishedPulling="2025-10-06 15:13:40.85904978 +0000 UTC m=+1218.014342292" observedRunningTime="2025-10-06 15:13:41.76185412 +0000 UTC m=+1218.917146652" watchObservedRunningTime="2025-10-06 15:13:41.763131186 +0000 UTC m=+1218.918423698" Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.767257 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.518490128 podStartE2EDuration="5.767248009s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="2025-10-06 15:13:37.609261421 +0000 UTC m=+1214.764553933" lastFinishedPulling="2025-10-06 15:13:40.858019302 +0000 UTC m=+1218.013311814" observedRunningTime="2025-10-06 15:13:41.742098316 +0000 UTC m=+1218.897390858" watchObservedRunningTime="2025-10-06 15:13:41.767248009 +0000 UTC m=+1218.922540521" Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.797534 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.503925166 podStartE2EDuration="5.797514323s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="2025-10-06 15:13:37.566571294 +0000 UTC m=+1214.721863806" lastFinishedPulling="2025-10-06 15:13:40.860160451 +0000 UTC m=+1218.015452963" observedRunningTime="2025-10-06 15:13:41.782625473 +0000 UTC m=+1218.937918005" watchObservedRunningTime="2025-10-06 15:13:41.797514323 +0000 UTC m=+1218.952806835" Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.799311 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.92060544 podStartE2EDuration="5.799304123s" podCreationTimestamp="2025-10-06 15:13:36 +0000 UTC" firstStartedPulling="2025-10-06 15:13:37.986718843 +0000 UTC m=+1215.142011365" lastFinishedPulling="2025-10-06 15:13:40.865417526 +0000 UTC m=+1218.020710048" observedRunningTime="2025-10-06 15:13:41.794339916 +0000 UTC m=+1218.949632438" watchObservedRunningTime="2025-10-06 15:13:41.799304123 +0000 UTC m=+1218.954596635" Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.969245 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 15:13:41 crc kubenswrapper[4763]: I1006 15:13:41.969306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.296976 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.301977 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.326204 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.336937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv\") pod \"bb8aab9b-6a2e-456b-8597-34e44d98e725\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.337103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data\") pod \"bb8aab9b-6a2e-456b-8597-34e44d98e725\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.337208 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle\") pod \"bb8aab9b-6a2e-456b-8597-34e44d98e725\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.337230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs\") pod \"bb8aab9b-6a2e-456b-8597-34e44d98e725\" (UID: \"bb8aab9b-6a2e-456b-8597-34e44d98e725\") " Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.337736 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs" (OuterVolumeSpecName: "logs") pod "bb8aab9b-6a2e-456b-8597-34e44d98e725" (UID: "bb8aab9b-6a2e-456b-8597-34e44d98e725"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.342857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv" (OuterVolumeSpecName: "kube-api-access-n9mrv") pod "bb8aab9b-6a2e-456b-8597-34e44d98e725" (UID: "bb8aab9b-6a2e-456b-8597-34e44d98e725"). InnerVolumeSpecName "kube-api-access-n9mrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.368137 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data" (OuterVolumeSpecName: "config-data") pod "bb8aab9b-6a2e-456b-8597-34e44d98e725" (UID: "bb8aab9b-6a2e-456b-8597-34e44d98e725"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.370543 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8aab9b-6a2e-456b-8597-34e44d98e725" (UID: "bb8aab9b-6a2e-456b-8597-34e44d98e725"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.442109 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.442140 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8aab9b-6a2e-456b-8597-34e44d98e725-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.442153 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8aab9b-6a2e-456b-8597-34e44d98e725-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.442164 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/bb8aab9b-6a2e-456b-8597-34e44d98e725-kube-api-access-n9mrv\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742357 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerID="c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" exitCode=0 Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742798 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerID="81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" exitCode=143 Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerDied","Data":"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487"} Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742409 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerDied","Data":"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4"} Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb8aab9b-6a2e-456b-8597-34e44d98e725","Type":"ContainerDied","Data":"11e961c4ff4ea632514fb621dc896f69308028ff7d81b877eb1f25ab4a821582"} Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.742970 4763 scope.go:117] "RemoveContainer" containerID="c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.780473 4763 scope.go:117] "RemoveContainer" containerID="81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.781382 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.795342 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.818588 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:42 crc kubenswrapper[4763]: E1006 15:13:42.819111 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-metadata" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.819135 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-metadata" Oct 06 15:13:42 crc kubenswrapper[4763]: E1006 15:13:42.819190 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-log" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.819199 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-log" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.819431 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-log" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.819464 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" containerName="nova-metadata-metadata" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.820454 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.826244 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.826293 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.841756 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.845646 4763 scope.go:117] "RemoveContainer" containerID="c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" Oct 06 15:13:42 crc kubenswrapper[4763]: E1006 15:13:42.847135 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487\": container with ID starting with c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487 not found: ID does not exist" containerID="c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.847178 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487"} err="failed to get container status \"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487\": rpc error: code = NotFound desc = could not find container \"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487\": container with ID starting with c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487 not found: ID does not exist" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.847233 4763 scope.go:117] "RemoveContainer" containerID="81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" Oct 06 15:13:42 crc kubenswrapper[4763]: E1006 15:13:42.847554 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4\": container with ID starting with 81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4 not found: ID does not exist" containerID="81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.847587 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4"} err="failed to get container status \"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4\": rpc error: code = NotFound desc = could not find container \"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4\": container with ID starting with 81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4 not found: ID does not exist" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.848433 4763 scope.go:117] "RemoveContainer" containerID="c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.849892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " 
pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.849961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.850526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.850578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.850676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmwd\" (UniqueName: \"kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.851114 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487"} err="failed to get container status \"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487\": rpc error: code = NotFound desc = could not find container \"c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487\": container with ID starting with c6429fa90d1a1278c5cb0831cea3665bee2d85c6b8aa5727f92641f7385f1487 not found: ID does not exist" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.851175 4763 scope.go:117] "RemoveContainer" containerID="81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.851737 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4"} err="failed to get container status \"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4\": rpc error: code = NotFound desc = could not find container \"81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4\": container with ID starting with 81f77f533c8685af69e400687f55887222554b2ec9890de7db137c2035d55ab4 not found: ID does not exist" Oct 06 15:13:42 crc kubenswrapper[4763]: E1006 15:13:42.902461 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8aab9b_6a2e_456b_8597_34e44d98e725.slice\": RecentStats: unable to find data in memory cache]" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.953351 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data\") pod \"nova-metadata-0\" 
(UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.953404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.953502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.953521 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.953555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmwd\" (UniqueName: \"kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.954141 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.959927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.960165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.968608 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:42 crc kubenswrapper[4763]: I1006 15:13:42.969663 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmwd\" (UniqueName: \"kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd\") pod \"nova-metadata-0\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " pod="openstack/nova-metadata-0" Oct 06 15:13:43 crc kubenswrapper[4763]: I1006 15:13:43.147712 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:43 crc kubenswrapper[4763]: I1006 15:13:43.594997 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8aab9b-6a2e-456b-8597-34e44d98e725" path="/var/lib/kubelet/pods/bb8aab9b-6a2e-456b-8597-34e44d98e725/volumes" Oct 06 15:13:43 crc kubenswrapper[4763]: I1006 15:13:43.596107 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:43 crc kubenswrapper[4763]: I1006 15:13:43.755572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerStarted","Data":"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"} Oct 06 15:13:43 crc kubenswrapper[4763]: I1006 15:13:43.755988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerStarted","Data":"642ea3161dd06b52685d2b02cede866f7dc772176212b2fc600d8f54a489690c"} Oct 06 15:13:44 crc kubenswrapper[4763]: I1006 15:13:44.781234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerStarted","Data":"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"} Oct 06 15:13:44 crc kubenswrapper[4763]: I1006 15:13:44.783634 4763 generic.go:334] "Generic (PLEG): container finished" podID="67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" containerID="bde4aed9fd9a6c5eefae7c1cc33564849b0279bade7ce7e70e2159cd7431f7f7" exitCode=0 Oct 06 15:13:44 crc kubenswrapper[4763]: I1006 15:13:44.783684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l75n6" event={"ID":"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae","Type":"ContainerDied","Data":"bde4aed9fd9a6c5eefae7c1cc33564849b0279bade7ce7e70e2159cd7431f7f7"} Oct 06 15:13:44 crc kubenswrapper[4763]: I1006 15:13:44.799414 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.799396211 podStartE2EDuration="2.799396211s" podCreationTimestamp="2025-10-06 15:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:44.79608076 +0000 UTC m=+1221.951373292" watchObservedRunningTime="2025-10-06 15:13:44.799396211 +0000 UTC m=+1221.954688723" Oct 06 15:13:45 crc kubenswrapper[4763]: I1006 15:13:45.798834 4763 generic.go:334] "Generic (PLEG): container finished" podID="325c5548-0def-436c-9c96-326c91d9b06e" containerID="1a19de1d681a2184e6bc1c868810837a0a80d978dbe170b9eb27d56fe034b52f" exitCode=0 Oct 06 15:13:45 crc kubenswrapper[4763]: I1006 15:13:45.798953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-94w6q" event={"ID":"325c5548-0def-436c-9c96-326c91d9b06e","Type":"ContainerDied","Data":"1a19de1d681a2184e6bc1c868810837a0a80d978dbe170b9eb27d56fe034b52f"} Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.205844 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l75n6" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.213328 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9jjc\" (UniqueName: \"kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc\") pod \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.221560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc" (OuterVolumeSpecName: "kube-api-access-h9jjc") pod "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" (UID: "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae"). InnerVolumeSpecName "kube-api-access-h9jjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.314373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle\") pod \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.314443 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts\") pod \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.314524 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data\") pod \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\" (UID: \"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae\") " Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.314966 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9jjc\" (UniqueName: \"kubernetes.io/projected/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-kube-api-access-h9jjc\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.327127 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts" (OuterVolumeSpecName: "scripts") pod "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" (UID: "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.348252 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" (UID: "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.352162 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data" (OuterVolumeSpecName: "config-data") pod "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" (UID: "67e3da3e-5e14-48a9-bb24-cc1a0b6fadae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.417102 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.417138 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.417155 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.819206 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l75n6" event={"ID":"67e3da3e-5e14-48a9-bb24-cc1a0b6fadae","Type":"ContainerDied","Data":"9a8460dcfe307412784d55425949b5cddb2757cb17c32f9dd94c731d4f504a8d"} Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.819255 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l75n6" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.819270 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8460dcfe307412784d55425949b5cddb2757cb17c32f9dd94c731d4f504a8d" Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.856129 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.856660 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-log" containerID="cri-o://37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8" gracePeriod=30 Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.857081 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-api" containerID="cri-o://821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" gracePeriod=30 Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.914980 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.915285 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" containerName="nova-scheduler-scheduler" containerID="cri-o://126667a9436f0492743db23237eed57cacad4240ef3e8dadf49cb9b3691ef0dc" gracePeriod=30 Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.926231 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.926772 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-log" containerID="cri-o://0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37" gracePeriod=30 Oct 06 15:13:46 crc kubenswrapper[4763]: I1006 15:13:46.927319 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-metadata" containerID="cri-o://60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219" gracePeriod=30 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.249644 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.316781 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.377309 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"] Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.377705 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-nft57" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="dnsmasq-dns" containerID="cri-o://dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553" gracePeriod=10 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.437127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle\") pod \"325c5548-0def-436c-9c96-326c91d9b06e\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.437263 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data\") pod \"325c5548-0def-436c-9c96-326c91d9b06e\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.437342 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts\") pod \"325c5548-0def-436c-9c96-326c91d9b06e\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.437458 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfs4g\" (UniqueName: \"kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g\") pod \"325c5548-0def-436c-9c96-326c91d9b06e\" (UID: \"325c5548-0def-436c-9c96-326c91d9b06e\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.443117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts" (OuterVolumeSpecName: "scripts") pod "325c5548-0def-436c-9c96-326c91d9b06e" (UID: "325c5548-0def-436c-9c96-326c91d9b06e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.447525 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g" (OuterVolumeSpecName: "kube-api-access-bfs4g") pod "325c5548-0def-436c-9c96-326c91d9b06e" (UID: "325c5548-0def-436c-9c96-326c91d9b06e"). InnerVolumeSpecName "kube-api-access-bfs4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.469485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data" (OuterVolumeSpecName: "config-data") pod "325c5548-0def-436c-9c96-326c91d9b06e" (UID: "325c5548-0def-436c-9c96-326c91d9b06e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.477291 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325c5548-0def-436c-9c96-326c91d9b06e" (UID: "325c5548-0def-436c-9c96-326c91d9b06e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.540794 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.540995 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.541096 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfs4g\" (UniqueName: \"kubernetes.io/projected/325c5548-0def-436c-9c96-326c91d9b06e-kube-api-access-bfs4g\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.541216 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325c5548-0def-436c-9c96-326c91d9b06e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.587634 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.595699 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.743708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data\") pod \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.743968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs\") pod \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.743993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle\") pod \"bd365b7f-22e1-423f-a22b-d6638661cbb5\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744079 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs\") pod \"bd365b7f-22e1-423f-a22b-d6638661cbb5\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744157 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle\") pod \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744199 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bmwd\" (UniqueName: \"kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd\") pod \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744259 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data\") pod \"bd365b7f-22e1-423f-a22b-d6638661cbb5\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs\") pod \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\" (UID: \"caccd23e-e9cb-47dc-80e4-c3cf6810a90a\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.744324 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9rnd\" (UniqueName: \"kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd\") pod \"bd365b7f-22e1-423f-a22b-d6638661cbb5\" (UID: \"bd365b7f-22e1-423f-a22b-d6638661cbb5\") " Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.745929 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs" (OuterVolumeSpecName: "logs") pod "bd365b7f-22e1-423f-a22b-d6638661cbb5" (UID: 
"bd365b7f-22e1-423f-a22b-d6638661cbb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.747204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs" (OuterVolumeSpecName: "logs") pod "caccd23e-e9cb-47dc-80e4-c3cf6810a90a" (UID: "caccd23e-e9cb-47dc-80e4-c3cf6810a90a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.757869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd" (OuterVolumeSpecName: "kube-api-access-d9rnd") pod "bd365b7f-22e1-423f-a22b-d6638661cbb5" (UID: "bd365b7f-22e1-423f-a22b-d6638661cbb5"). InnerVolumeSpecName "kube-api-access-d9rnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.758995 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd" (OuterVolumeSpecName: "kube-api-access-6bmwd") pod "caccd23e-e9cb-47dc-80e4-c3cf6810a90a" (UID: "caccd23e-e9cb-47dc-80e4-c3cf6810a90a"). InnerVolumeSpecName "kube-api-access-6bmwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.776398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caccd23e-e9cb-47dc-80e4-c3cf6810a90a" (UID: "caccd23e-e9cb-47dc-80e4-c3cf6810a90a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.785514 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data" (OuterVolumeSpecName: "config-data") pod "caccd23e-e9cb-47dc-80e4-c3cf6810a90a" (UID: "caccd23e-e9cb-47dc-80e4-c3cf6810a90a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.788435 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data" (OuterVolumeSpecName: "config-data") pod "bd365b7f-22e1-423f-a22b-d6638661cbb5" (UID: "bd365b7f-22e1-423f-a22b-d6638661cbb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.800719 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd365b7f-22e1-423f-a22b-d6638661cbb5" (UID: "bd365b7f-22e1-423f-a22b-d6638661cbb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.812505 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847452 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9rnd\" (UniqueName: \"kubernetes.io/projected/bd365b7f-22e1-423f-a22b-d6638661cbb5-kube-api-access-d9rnd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847502 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847527 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847539 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd365b7f-22e1-423f-a22b-d6638661cbb5-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847553 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847569 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bmwd\" (UniqueName: \"kubernetes.io/projected/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-kube-api-access-6bmwd\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847581 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd365b7f-22e1-423f-a22b-d6638661cbb5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.847689 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.865386 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "caccd23e-e9cb-47dc-80e4-c3cf6810a90a" (UID: "caccd23e-e9cb-47dc-80e4-c3cf6810a90a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870177 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerID="821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" exitCode=0 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870250 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerID="37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8" exitCode=143 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870495 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerDied","Data":"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerDied","Data":"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd365b7f-22e1-423f-a22b-d6638661cbb5","Type":"ContainerDied","Data":"65115602481f4d31165ea2ab2db64f97f83ff762cc38132c0cfe81d46414eb2d"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.870944 4763 scope.go:117] "RemoveContainer" containerID="821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.875796 4763 generic.go:334] "Generic (PLEG): container finished" podID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerID="60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219" exitCode=0 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.875824 4763 generic.go:334] "Generic (PLEG): container finished" podID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerID="0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37" exitCode=143 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.875887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerDied","Data":"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.875911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerDied","Data":"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.875921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"caccd23e-e9cb-47dc-80e4-c3cf6810a90a","Type":"ContainerDied","Data":"642ea3161dd06b52685d2b02cede866f7dc772176212b2fc600d8f54a489690c"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.876005 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.881314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-94w6q" event={"ID":"325c5548-0def-436c-9c96-326c91d9b06e","Type":"ContainerDied","Data":"24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.881357 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24925a7b3b48152b8ac23695a86e038a962c05b22ffc3041197b4f36d64d6d16" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.881902 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-94w6q" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.888629 4763 generic.go:334] "Generic (PLEG): container finished" podID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerID="dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553" exitCode=0 Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.888689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-nft57" event={"ID":"85ec42c3-f313-4218-b323-55d9e5d0a78c","Type":"ContainerDied","Data":"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.888720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-nft57" event={"ID":"85ec42c3-f313-4218-b323-55d9e5d0a78c","Type":"ContainerDied","Data":"7f9b585cdc682976821929e9f5c8ffc13c10faaf582163293868b6711960e4c5"} Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.895527 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-nft57" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.912253 4763 scope.go:117] "RemoveContainer" containerID="37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.920338 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.920817 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-log" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.920839 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-log" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.920872 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-log" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.920881 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-log" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.920898 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" containerName="nova-manage" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.920906 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" containerName="nova-manage" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.921123 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-metadata" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921141 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-metadata" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.921165 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325c5548-0def-436c-9c96-326c91d9b06e" containerName="nova-cell1-conductor-db-sync" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921172 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="325c5548-0def-436c-9c96-326c91d9b06e" containerName="nova-cell1-conductor-db-sync" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 
15:13:47.921183 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="init" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921191 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="init" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.921205 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="dnsmasq-dns" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921214 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="dnsmasq-dns" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.921229 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-api" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921237 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-api" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921495 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-metadata" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921524 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" containerName="nova-metadata-log" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921535 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" containerName="dnsmasq-dns" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921552 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-log" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921568 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="325c5548-0def-436c-9c96-326c91d9b06e" containerName="nova-cell1-conductor-db-sync" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921580 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" containerName="nova-api-api" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.921596 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" containerName="nova-manage" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.922249 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.923496 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.941520 4763 scope.go:117] "RemoveContainer" containerID="821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.941823 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.941985 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051\": container with ID starting with 821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051 not found: ID does not exist" containerID="821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942024 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051"} err="failed to get container status \"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051\": rpc error: code = NotFound desc = could not find container \"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051\": container with ID starting with 821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051 not found: ID does not exist" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942049 4763 scope.go:117] "RemoveContainer" containerID="37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8" Oct 06 15:13:47 crc kubenswrapper[4763]: E1006 15:13:47.942436 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8\": container with ID starting with 37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8 not found: ID does not exist" containerID="37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942462 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8"} err="failed to get container status \"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8\": rpc error: code = NotFound desc = could not find container \"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8\": container with ID starting with 37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8 not found: ID does not exist" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942476 4763 scope.go:117] "RemoveContainer" containerID="821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051" Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942725 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051"} err="failed to get container status \"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051\": rpc error: code = NotFound desc = could not find container \"821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051\": container with ID starting with 
821f8f5b4af2befc135816aab2a84975a2c49e5ccc3fd2c6d42f49ba227fd051 not found: ID does not exist"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942744 4763 scope.go:117] "RemoveContainer" containerID="37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942972 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8"} err="failed to get container status \"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8\": rpc error: code = NotFound desc = could not find container \"37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8\": container with ID starting with 37a28867d20847d4892d4c59e26f45e99b178d311dc954b8d651cd6e5f2307b8 not found: ID does not exist"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.942991 4763 scope.go:117] "RemoveContainer" containerID="60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948310 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948356 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948648 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5pqn\" (UniqueName: \"kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.948946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0\") pod \"85ec42c3-f313-4218-b323-55d9e5d0a78c\" (UID: \"85ec42c3-f313-4218-b323-55d9e5d0a78c\") "
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.949303 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/caccd23e-e9cb-47dc-80e4-c3cf6810a90a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.961516 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn" (OuterVolumeSpecName: "kube-api-access-d5pqn") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "kube-api-access-d5pqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.964993 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.972090 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.980525 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.987941 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.995384 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.996921 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.999431 4763 scope.go:117] "RemoveContainer" containerID="0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"
Oct 06 15:13:47 crc kubenswrapper[4763]: I1006 15:13:47.999455 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.002106 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.017275 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.026457 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.027946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.031319 4763 scope.go:117] "RemoveContainer" containerID="60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"
Oct 06 15:13:48 crc kubenswrapper[4763]: E1006 15:13:48.032107 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219\": container with ID starting with 60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219 not found: ID does not exist" containerID="60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.032170 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"} err="failed to get container status \"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219\": rpc error: code = NotFound desc = could not find container \"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219\": container with ID starting with 60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.032193 4763 scope.go:117] "RemoveContainer" containerID="0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.032497 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: E1006 15:13:48.032817 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37\": container with ID starting with 0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37 not found: ID does not exist" containerID="0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.032844 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"} err="failed to get container status \"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37\": rpc error: code = NotFound desc = could not find container \"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37\": container with ID starting with 0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.032889 4763 scope.go:117] "RemoveContainer" containerID="60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.033252 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219"} err="failed to get container status \"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219\": rpc error: code = NotFound desc = could not find container \"60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219\": container with ID starting with 60904e15c7bffb7f26908a2fad8d3705ca04b6f1d609db35e3be501219e64219 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.033269 4763 scope.go:117] "RemoveContainer" containerID="0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.034555 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.037545 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.037861 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37"} err="failed to get container status \"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37\": rpc error: code = NotFound desc = could not find container \"0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37\": container with ID starting with 0edaa35a378ca1fabb6ec15498a83fd688685418f019fc078fae1d0102c20b37 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.037884 4763 scope.go:117] "RemoveContainer" containerID="dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.038909 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config" (OuterVolumeSpecName: "config") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.044265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.049034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85ec42c3-f313-4218-b323-55d9e5d0a78c" (UID: "85ec42c3-f313-4218-b323-55d9e5d0a78c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.051914 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5tc\" (UniqueName: \"kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052077 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052339 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052353 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052361 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052369 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052377 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5pqn\" (UniqueName: \"kubernetes.io/projected/85ec42c3-f313-4218-b323-55d9e5d0a78c-kube-api-access-d5pqn\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.052388 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec42c3-f313-4218-b323-55d9e5d0a78c-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.068507 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.069705 4763 scope.go:117] "RemoveContainer" containerID="791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.093900 4763 scope.go:117] "RemoveContainer" containerID="dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"
Oct 06 15:13:48 crc kubenswrapper[4763]: E1006 15:13:48.096002 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553\": container with ID starting with dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553 not found: ID does not exist" containerID="dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.096110 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553"} err="failed to get container status \"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553\": rpc error: code = NotFound desc = could not find container \"dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553\": container with ID starting with dc3acb1b74efd373e397f3d666fa6a737d91a65cee7044e86d83953cea532553 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.096199 4763 scope.go:117] "RemoveContainer" containerID="791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80"
Oct 06 15:13:48 crc kubenswrapper[4763]: E1006 15:13:48.096531 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80\": container with ID starting with 791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80 not found: ID does not exist" containerID="791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.096569 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80"} err="failed to get container status \"791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80\": rpc error: code = NotFound desc = could not find container \"791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80\": container with ID starting with 791b8ab088fb313886100b39ddf1b15076c55bd490ce5177d3636c436ceb4a80 not found: ID does not exist"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.154389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.154648 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.154786 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5tc\" (UniqueName: \"kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155164 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h27n\" (UniqueName: \"kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155758 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.155912 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vxn\" (UniqueName: \"kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.160536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.161094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.169465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn5tc\" (UniqueName: \"kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc\") pod \"nova-cell1-conductor-0\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.254546 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257376 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h27n\" (UniqueName: \"kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vxn\" (UniqueName: \"kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.257983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.258343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.261371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.263452 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.264026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.263877 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.264838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.277223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h27n\" (UniqueName: \"kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n\") pod \"nova-api-0\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.282266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vxn\" (UniqueName: \"kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn\") pod \"nova-metadata-0\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.287896 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-nft57"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.315853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.354535 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.775692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.899555 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" containerID="126667a9436f0492743db23237eed57cacad4240ef3e8dadf49cb9b3691ef0dc" exitCode=0
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.899639 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1","Type":"ContainerDied","Data":"126667a9436f0492743db23237eed57cacad4240ef3e8dadf49cb9b3691ef0dc"}
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.909184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45540131-5bd7-47c8-bab3-da9362ab3aa3","Type":"ContainerStarted","Data":"58f85118fbeb99b50c9ebc06e5f76b2d576ce5d3467a796f3f7d52a786c73ba1"}
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.916704 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.922162 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:13:48 crc kubenswrapper[4763]: I1006 15:13:48.929249 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:13:48 crc kubenswrapper[4763]: W1006 15:13:48.930299 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0624b1fc_a666_447b_af25_a231087b13ef.slice/crio-b4b9345093a0a28cf9c98eec456995bfa3c9196567cc20220170801d2b528065 WatchSource:0}: Error finding container b4b9345093a0a28cf9c98eec456995bfa3c9196567cc20220170801d2b528065: Status 404 returned error can't find the container with id b4b9345093a0a28cf9c98eec456995bfa3c9196567cc20220170801d2b528065
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.084108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data\") pod \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") "
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.084288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle\") pod \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") "
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.084330 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfvx\" (UniqueName: \"kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx\") pod \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\" (UID: \"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1\") "
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.088975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx" (OuterVolumeSpecName: "kube-api-access-ngfvx") pod "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" (UID: "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1"). InnerVolumeSpecName "kube-api-access-ngfvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.113560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data" (OuterVolumeSpecName: "config-data") pod "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" (UID: "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.128749 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" (UID: "8a8b95e5-bb0d-474a-a5b6-817570e9f2b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.186413 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.186449 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfvx\" (UniqueName: \"kubernetes.io/projected/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-kube-api-access-ngfvx\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.186459 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.591406 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ec42c3-f313-4218-b323-55d9e5d0a78c" path="/var/lib/kubelet/pods/85ec42c3-f313-4218-b323-55d9e5d0a78c/volumes"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.592732 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd365b7f-22e1-423f-a22b-d6638661cbb5" path="/var/lib/kubelet/pods/bd365b7f-22e1-423f-a22b-d6638661cbb5/volumes"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.593699 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caccd23e-e9cb-47dc-80e4-c3cf6810a90a" path="/var/lib/kubelet/pods/caccd23e-e9cb-47dc-80e4-c3cf6810a90a/volumes"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.920167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a8b95e5-bb0d-474a-a5b6-817570e9f2b1","Type":"ContainerDied","Data":"5f2f8f07ceee57edf7b0666754cf30784ffe8b9ecd9b7a750b102a6696b41da7"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.920218 4763 scope.go:117] "RemoveContainer" containerID="126667a9436f0492743db23237eed57cacad4240ef3e8dadf49cb9b3691ef0dc"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.920370 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.923073 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45540131-5bd7-47c8-bab3-da9362ab3aa3","Type":"ContainerStarted","Data":"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.924497 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.927437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerStarted","Data":"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.927476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerStarted","Data":"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.927491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerStarted","Data":"8db1c7e515f46fa2a5d398392ce7704f34918112c500df73e8606889cd188bf6"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.930958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerStarted","Data":"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.930997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerStarted","Data":"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.931011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerStarted","Data":"b4b9345093a0a28cf9c98eec456995bfa3c9196567cc20220170801d2b528065"}
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.960410 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.96038915 podStartE2EDuration="2.96038915s" podCreationTimestamp="2025-10-06 15:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:49.94185572 +0000 UTC m=+1227.097148232" watchObservedRunningTime="2025-10-06 15:13:49.96038915 +0000 UTC m=+1227.115681662"
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.977018 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.985489 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:49 crc kubenswrapper[4763]: I1006 15:13:49.990199 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9901778610000003 podStartE2EDuration="2.990177861s" podCreationTimestamp="2025-10-06 15:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:49.972134064 +0000 UTC m=+1227.127426596" watchObservedRunningTime="2025-10-06 15:13:49.990177861 +0000 UTC m=+1227.145470373"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.014234 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:50 crc kubenswrapper[4763]: E1006 15:13:50.014571 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" containerName="nova-scheduler-scheduler"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.014587 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" containerName="nova-scheduler-scheduler"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.014804 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" containerName="nova-scheduler-scheduler"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.015389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.017643 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.022841 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.023080 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.023037137 podStartE2EDuration="3.023037137s" podCreationTimestamp="2025-10-06 15:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:49.988879306 +0000 UTC m=+1227.144171818" watchObservedRunningTime="2025-10-06 15:13:50.023037137 +0000 UTC m=+1227.178329649"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.102702 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xlmk\" (UniqueName: \"kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.102768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.103279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.205546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.205652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xlmk\" (UniqueName: \"kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.205694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.211593 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.215110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.227145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xlmk\" (UniqueName: \"kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk\") pod \"nova-scheduler-0\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.336675 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.832414 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:13:50 crc kubenswrapper[4763]: I1006 15:13:50.942377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5cdea547-7521-4ef5-ac59-a079224a4577","Type":"ContainerStarted","Data":"8eada450cd30387121614f710fb08b0fc8effc99159c2393f509c3cee44fe499"}
Oct 06 15:13:51 crc kubenswrapper[4763]: I1006 15:13:51.592595 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8b95e5-bb0d-474a-a5b6-817570e9f2b1" path="/var/lib/kubelet/pods/8a8b95e5-bb0d-474a-a5b6-817570e9f2b1/volumes"
Oct 06 15:13:51 crc kubenswrapper[4763]: I1006 15:13:51.923287 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 06 15:13:51 crc kubenswrapper[4763]: I1006 15:13:51.972711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5cdea547-7521-4ef5-ac59-a079224a4577","Type":"ContainerStarted","Data":"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690"}
Oct 06 15:13:52 crc kubenswrapper[4763]: I1006 15:13:52.004988 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.004959857 podStartE2EDuration="3.004959857s" podCreationTimestamp="2025-10-06 15:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:13:52.002097628 +0000 UTC m=+1229.157390150" watchObservedRunningTime="2025-10-06 15:13:52.004959857 +0000 UTC m=+1229.160252409"
Oct 06 15:13:53 crc kubenswrapper[4763]: I1006 15:13:53.296069 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 06 15:13:53 crc kubenswrapper[4763]: I1006 15:13:53.316745 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 15:13:53 crc kubenswrapper[4763]: I1006 15:13:53.316809 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 15:13:55 crc kubenswrapper[4763]: I1006 15:13:55.337062 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 06 15:13:55 crc kubenswrapper[4763]: I1006 15:13:55.810066 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:55 crc kubenswrapper[4763]: I1006 15:13:55.810347 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" containerName="kube-state-metrics" containerID="cri-o://993dfcf56ad01e08872a874c38355b60188342f405c463c9cfdb706de429021f" gracePeriod=30
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.029432 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" containerID="993dfcf56ad01e08872a874c38355b60188342f405c463c9cfdb706de429021f" exitCode=2
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.029578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac","Type":"ContainerDied","Data":"993dfcf56ad01e08872a874c38355b60188342f405c463c9cfdb706de429021f"}
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.343342 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.481716 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxlxc\" (UniqueName: \"kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc\") pod \"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac\" (UID: \"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac\") "
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.511914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc" (OuterVolumeSpecName: "kube-api-access-bxlxc") pod "2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" (UID: "2b82aad8-d8ea-4e3f-9b7f-973232a5ffac"). InnerVolumeSpecName "kube-api-access-bxlxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:13:56 crc kubenswrapper[4763]: I1006 15:13:56.586172 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxlxc\" (UniqueName: \"kubernetes.io/projected/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac-kube-api-access-bxlxc\") on node \"crc\" DevicePath \"\""
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.040099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b82aad8-d8ea-4e3f-9b7f-973232a5ffac","Type":"ContainerDied","Data":"c616aa6561518f62d7511516d602dd87d23300e10e9d03e6cec6b42d7eb2ae74"}
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.040160 4763 scope.go:117] "RemoveContainer" containerID="993dfcf56ad01e08872a874c38355b60188342f405c463c9cfdb706de429021f"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.040300 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.077542 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.088160 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.103703 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:57 crc kubenswrapper[4763]: E1006 15:13:57.104155 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" containerName="kube-state-metrics"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.104179 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" containerName="kube-state-metrics"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.104440 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" containerName="kube-state-metrics"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.105421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.109957 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.111001 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.114119 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.198173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.198250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxtd\" (UniqueName: \"kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.198280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.198300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.299966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.300390 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxtd\" (UniqueName: \"kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.300520 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.300655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.304217 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.306188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.317183 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.327311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxtd\" (UniqueName: \"kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd\") pod \"kube-state-metrics-0\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") " pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.426853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.592969 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b82aad8-d8ea-4e3f-9b7f-973232a5ffac" path="/var/lib/kubelet/pods/2b82aad8-d8ea-4e3f-9b7f-973232a5ffac/volumes"
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.626515 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.626914 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-central-agent" containerID="cri-o://00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be" gracePeriod=30
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.627402 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="proxy-httpd" containerID="cri-o://f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898" gracePeriod=30
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.627541 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-notification-agent" containerID="cri-o://df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792" gracePeriod=30
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.627594 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="sg-core" containerID="cri-o://3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8" gracePeriod=30
Oct 06 15:13:57 crc kubenswrapper[4763]: I1006 15:13:57.999488 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.050446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0c33271c-af2f-43e4-adaf-9a81ef747ee5","Type":"ContainerStarted","Data":"1a14369cae17ccba796f132e19e47253c65fd8c6b6b2c692c0e7018201c68c89"}
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052534 4763 generic.go:334] "Generic (PLEG): container finished" podID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerID="f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898" exitCode=0
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052564 4763 generic.go:334] "Generic (PLEG): container finished" podID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerID="3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8" exitCode=2
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052570 4763 generic.go:334] "Generic (PLEG): container finished" podID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerID="00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be" exitCode=0
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052611 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerDied","Data":"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898"}
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerDied","Data":"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8"}
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.052668 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerDied","Data":"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be"}
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.316903 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.317267 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.355587 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 15:13:58 crc kubenswrapper[4763]: I1006 15:13:58.355669 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.065655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0c33271c-af2f-43e4-adaf-9a81ef747ee5","Type":"ContainerStarted","Data":"b5a74a7863d471d5ebd8edc913311b4b8759f4a34d7919b70ef31a57831def5c"}
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.065987 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.084111 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.723366097 podStartE2EDuration="2.084092868s" podCreationTimestamp="2025-10-06 15:13:57 +0000 UTC" firstStartedPulling="2025-10-06 15:13:58.002589873 +0000 UTC m=+1235.157882385" lastFinishedPulling="2025-10-06 15:13:58.363316644 +0000 UTC m=+1235.518609156" observedRunningTime="2025-10-06 15:13:59.081018023 +0000 UTC m=+1236.236310535" watchObservedRunningTime="2025-10-06 15:13:59.084092868 +0000 UTC m=+1236.239385380"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.330839 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.330866 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.437987 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.437991 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 15:13:59 crc kubenswrapper[4763]: I1006 15:13:59.973226 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrdpm\" (UniqueName: \"kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063467 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063522 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.063753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml\") pod \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\" (UID: \"d17ae0bf-7064-4085-9b58-069a7c56ed7d\") "
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.064467 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.065825 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.068586 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm" (OuterVolumeSpecName: "kube-api-access-rrdpm") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "kube-api-access-rrdpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.071058 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts" (OuterVolumeSpecName: "scripts") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.090323 4763 generic.go:334] "Generic (PLEG): container finished" podID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerID="df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792" exitCode=0
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.091097 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.091517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerDied","Data":"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792"}
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.091544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d17ae0bf-7064-4085-9b58-069a7c56ed7d","Type":"ContainerDied","Data":"3ad92d0f4b35d31ad834491c0bb8d4cec80a3e4bd6642644912f37218c397c53"}
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.091559 4763 scope.go:117] "RemoveContainer" containerID="f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898"
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.142993 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.151459 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165714 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165743 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165753 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165763 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrdpm\" (UniqueName: \"kubernetes.io/projected/d17ae0bf-7064-4085-9b58-069a7c56ed7d-kube-api-access-rrdpm\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165772 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.165781 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d17ae0bf-7064-4085-9b58-069a7c56ed7d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.175117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data" (OuterVolumeSpecName: "config-data") pod "d17ae0bf-7064-4085-9b58-069a7c56ed7d" (UID: "d17ae0bf-7064-4085-9b58-069a7c56ed7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.177268 4763 scope.go:117] "RemoveContainer" containerID="3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.197590 4763 scope.go:117] "RemoveContainer" containerID="df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.231745 4763 scope.go:117] "RemoveContainer" containerID="00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.267982 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17ae0bf-7064-4085-9b58-069a7c56ed7d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.273670 4763 scope.go:117] "RemoveContainer" containerID="f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.274296 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898\": container with ID starting with f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898 not found: ID does not exist" containerID="f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.274326 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898"} err="failed to get container status \"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898\": rpc error: code = NotFound desc = could not find container \"f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898\": container with ID starting with f1bbe32c7015d99beb8103773acd91c056c040df1934b2b3ea481c2a6ddc9898 not found: ID does not exist" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.274346 4763 scope.go:117] "RemoveContainer" containerID="3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.274772 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8\": container with ID starting with 3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8 not found: ID does not exist" containerID="3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.274802 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8"} err="failed to get container status \"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8\": rpc error: code = NotFound desc = could not find container \"3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8\": container with ID starting with 3efd9a4703ad5349d05b9c8875b474f91897705bf473f162e7bf52cdf70175f8 not found: ID does not exist" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.274832 4763 scope.go:117] "RemoveContainer" containerID="df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 
15:14:00.275063 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792\": container with ID starting with df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792 not found: ID does not exist" containerID="df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.275086 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792"} err="failed to get container status \"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792\": rpc error: code = NotFound desc = could not find container \"df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792\": container with ID starting with df05b8733fd8590f4072753122e85afb6426aba98197139d8992d9cc1aef2792 not found: ID does not exist" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.275100 4763 scope.go:117] "RemoveContainer" containerID="00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.275400 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be\": container with ID starting with 00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be not found: ID does not exist" containerID="00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.275438 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be"} err="failed to get container status \"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be\": rpc error: code = NotFound desc = could not find container \"00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be\": container with ID starting with 00f485e5202e634d0e69e0878b1e6cbc93a4397b56013fbcaed8132f2884e9be not found: ID does not exist" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.337912 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.365771 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.450805 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.473494 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.485287 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.485866 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-notification-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.485889 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-notification-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.485920 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="proxy-httpd" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.485930 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="proxy-httpd" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.485941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-central-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.485951 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-central-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: E1006 15:14:00.485973 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="sg-core" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.485981 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="sg-core" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.486204 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="sg-core" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.486231 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-central-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.486246 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="ceilometer-notification-agent" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.486268 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" containerName="proxy-httpd" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.488495 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.493003 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.493340 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.493515 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.499221 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh68q\" (UniqueName: \"kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.588975 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.690952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh68q\" (UniqueName: \"kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691153 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691225 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691250 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.691837 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.696960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.697142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.698454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.703230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.705405 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.709094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh68q\" (UniqueName: \"kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q\") pod \"ceilometer-0\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " pod="openstack/ceilometer-0" Oct 06 15:14:00 crc kubenswrapper[4763]: I1006 15:14:00.810149 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:01 crc kubenswrapper[4763]: I1006 15:14:01.129364 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 15:14:01 crc kubenswrapper[4763]: I1006 15:14:01.301574 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:01 crc kubenswrapper[4763]: W1006 15:14:01.302967 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff9bb524_abff_439d_ae56_0fb5f6d5f0bd.slice/crio-a297750e7e715bec371683574a64474c938eecea4489a0e61953c65b338cb3a6 WatchSource:0}: Error finding container a297750e7e715bec371683574a64474c938eecea4489a0e61953c65b338cb3a6: Status 404 returned error can't find the container with id a297750e7e715bec371683574a64474c938eecea4489a0e61953c65b338cb3a6 Oct 06 15:14:01 crc kubenswrapper[4763]: I1006 15:14:01.584453 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17ae0bf-7064-4085-9b58-069a7c56ed7d" path="/var/lib/kubelet/pods/d17ae0bf-7064-4085-9b58-069a7c56ed7d/volumes" Oct 06 15:14:02 crc kubenswrapper[4763]: I1006 15:14:02.119433 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerStarted","Data":"a297750e7e715bec371683574a64474c938eecea4489a0e61953c65b338cb3a6"} Oct 06 15:14:03 crc kubenswrapper[4763]: I1006 15:14:03.127754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerStarted","Data":"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36"} Oct 06 15:14:03 crc kubenswrapper[4763]: I1006 15:14:03.876231 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:14:03 crc kubenswrapper[4763]: I1006 15:14:03.876779 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:14:04 crc kubenswrapper[4763]: I1006 15:14:04.146189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerStarted","Data":"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977"} Oct 06 15:14:04 crc kubenswrapper[4763]: I1006 15:14:04.146240 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerStarted","Data":"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b"} Oct 06 15:14:06 crc kubenswrapper[4763]: I1006 15:14:06.170917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerStarted","Data":"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a"} Oct 06 15:14:06 crc kubenswrapper[4763]: I1006 15:14:06.171483 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Oct 06 15:14:06 crc kubenswrapper[4763]: I1006 15:14:06.216940 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.033538371 podStartE2EDuration="6.21691254s" podCreationTimestamp="2025-10-06 15:14:00 +0000 UTC" firstStartedPulling="2025-10-06 15:14:01.304704165 +0000 UTC m=+1238.459996677" lastFinishedPulling="2025-10-06 15:14:05.488078324 +0000 UTC m=+1242.643370846" observedRunningTime="2025-10-06 15:14:06.199891411 +0000 UTC m=+1243.355183963" watchObservedRunningTime="2025-10-06 15:14:06.21691254 +0000 UTC m=+1243.372205092" Oct 06 15:14:07 crc kubenswrapper[4763]: I1006 15:14:07.443596 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.321716 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.324693 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.328598 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.359734 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.360367 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.360782 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 15:14:08 crc kubenswrapper[4763]: I1006 15:14:08.365011 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.211418 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.218657 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.222026 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.438476 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.440885 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.463519 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.582318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7fk\" (UniqueName: \"kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.582936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.582967 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.583067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.583096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.583132 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7fk\" (UniqueName: \"kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685225 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.685323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.687151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.687203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.687239 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.687541 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.687879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.713780 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7fk\" (UniqueName: 
\"kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk\") pod \"dnsmasq-dns-59cf4bdb65-p54rq\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") " pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:09 crc kubenswrapper[4763]: I1006 15:14:09.786709 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:10 crc kubenswrapper[4763]: I1006 15:14:10.273796 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:14:10 crc kubenswrapper[4763]: W1006 15:14:10.280960 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc65588a5_9e57_4d62_8abf_c0154251b6eb.slice/crio-6dcf2c16f1b1c07a5bb4103238b712e3a60a85f87ce83eb7109b7b986905ffae WatchSource:0}: Error finding container 6dcf2c16f1b1c07a5bb4103238b712e3a60a85f87ce83eb7109b7b986905ffae: Status 404 returned error can't find the container with id 6dcf2c16f1b1c07a5bb4103238b712e3a60a85f87ce83eb7109b7b986905ffae Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.228531 4763 generic.go:334] "Generic (PLEG): container finished" podID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerID="75069cc9ad83a59a931dc57aa44f9682f57ca621f24924238094ae0af4cdfe13" exitCode=0 Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.229461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" event={"ID":"c65588a5-9e57-4d62-8abf-c0154251b6eb","Type":"ContainerDied","Data":"75069cc9ad83a59a931dc57aa44f9682f57ca621f24924238094ae0af4cdfe13"} Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.229558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" event={"ID":"c65588a5-9e57-4d62-8abf-c0154251b6eb","Type":"ContainerStarted","Data":"6dcf2c16f1b1c07a5bb4103238b712e3a60a85f87ce83eb7109b7b986905ffae"} Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.782065 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.782520 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-central-agent" containerID="cri-o://26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36" gracePeriod=30 Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.782903 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="proxy-httpd" containerID="cri-o://105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a" gracePeriod=30 Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.782954 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="sg-core" containerID="cri-o://e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977" gracePeriod=30 Oct 06 15:14:11 crc kubenswrapper[4763]: I1006 15:14:11.782986 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-notification-agent" containerID="cri-o://20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b" gracePeriod=30 Oct 06 15:14:12 
crc kubenswrapper[4763]: I1006 15:14:12.018547 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.179053 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.240932 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data\") pod \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.241240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjkxk\" (UniqueName: \"kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk\") pod \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.241284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") pod \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\" (UID: \"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.245782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk" (OuterVolumeSpecName: "kube-api-access-xjkxk") pod "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" (UID: "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e"). InnerVolumeSpecName "kube-api-access-xjkxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248009 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerID="105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a" exitCode=0 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248036 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerID="e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977" exitCode=2 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248045 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerID="26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36" exitCode=0 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerDied","Data":"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerDied","Data":"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.248162 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerDied","Data":"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.250197 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" containerID="02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a" exitCode=137 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.250280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e","Type":"ContainerDied","Data":"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.250308 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.250328 4763 scope.go:117] "RemoveContainer" containerID="02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.250314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b1f6ac6-57fb-4897-a563-5823c8ed8d3e","Type":"ContainerDied","Data":"bd3715936f2b8b5b8117560b41b2470b062b366e8167286ea6c309f6722d6b8a"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.258870 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-log" containerID="cri-o://081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4" gracePeriod=30 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.259474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" event={"ID":"c65588a5-9e57-4d62-8abf-c0154251b6eb","Type":"ContainerStarted","Data":"e558c915bfc1f5ce99283b28f4010dce7a4750b02b11e663509a4bc0669fd4a9"} Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.259543 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-api" containerID="cri-o://3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8" gracePeriod=30 Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.260106 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.287441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" (UID: "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.296299 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" podStartSLOduration=3.296276809 podStartE2EDuration="3.296276809s" podCreationTimestamp="2025-10-06 15:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:12.28472029 +0000 UTC m=+1249.440012812" watchObservedRunningTime="2025-10-06 15:14:12.296276809 +0000 UTC m=+1249.451569321" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.300150 4763 scope.go:117] "RemoveContainer" containerID="02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a" Oct 06 15:14:12 crc kubenswrapper[4763]: E1006 15:14:12.302279 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a\": container with ID starting with 02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a not found: ID does not exist" containerID="02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.302323 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a"} err="failed to get container status \"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a\": rpc error: code = NotFound desc = could not find container \"02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a\": container with ID starting with 02c482c8de56350a57f1ec8f4c50508b234c6d990d25a08477b0a3c1859cbd0a not found: ID does not exist" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.321672 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data" (OuterVolumeSpecName: "config-data") pod "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" (UID: "5b1f6ac6-57fb-4897-a563-5823c8ed8d3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.344940 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjkxk\" (UniqueName: \"kubernetes.io/projected/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-kube-api-access-xjkxk\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.344987 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.345000 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.581416 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.588148 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.617839 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:14:12 crc kubenswrapper[4763]: E1006 15:14:12.618389 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.618414 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.618673 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.619465 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.622247 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.622482 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.622482 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.643590 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.650374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.650437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.650701 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.650737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrvl\" (UniqueName: \"kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.650782 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.752419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.752470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrvl\" (UniqueName: \"kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.752507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.752625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.752653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.757401 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.757764 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.763945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.763993 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.769112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrvl\" (UniqueName: \"kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl\") pod \"nova-cell1-novncproxy-0\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.800364 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853525 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853729 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh68q\" (UniqueName: \"kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.853888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd\") pod \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\" (UID: \"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd\") " Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.854771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.856379 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.860418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts" (OuterVolumeSpecName: "scripts") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.863483 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q" (OuterVolumeSpecName: "kube-api-access-rh68q") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "kube-api-access-rh68q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.896157 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.920398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.935769 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956923 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956950 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956961 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956968 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh68q\" (UniqueName: \"kubernetes.io/projected/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-kube-api-access-rh68q\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956976 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.956983 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.960483 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:12 crc kubenswrapper[4763]: I1006 15:14:12.973816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data" (OuterVolumeSpecName: "config-data") pod "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" (UID: "ff9bb524-abff-439d-ae56-0fb5f6d5f0bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.058631 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.058887 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.272493 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerID="20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b" exitCode=0 Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.272581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerDied","Data":"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b"} Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.272634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff9bb524-abff-439d-ae56-0fb5f6d5f0bd","Type":"ContainerDied","Data":"a297750e7e715bec371683574a64474c938eecea4489a0e61953c65b338cb3a6"} Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.272655 4763 scope.go:117] "RemoveContainer" containerID="105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.272786 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.279663 4763 generic.go:334] "Generic (PLEG): container finished" podID="0624b1fc-a666-447b-af25-a231087b13ef" containerID="081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4" exitCode=143 Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.279719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerDied","Data":"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4"} Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.298353 4763 scope.go:117] "RemoveContainer" containerID="e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.320118 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.331344 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.352757 4763 scope.go:117] "RemoveContainer" containerID="20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.353309 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.353772 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="proxy-httpd" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.353789 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="proxy-httpd" Oct 06 15:14:13 crc 
kubenswrapper[4763]: E1006 15:14:13.353819 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-notification-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.353828 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-notification-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.353844 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-central-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.353850 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-central-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.353866 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="sg-core" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.353873 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="sg-core" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.358018 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-central-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.358070 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="proxy-httpd" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.358090 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="ceilometer-notification-agent" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.358110 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" containerName="sg-core" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.360786 4763 util.go:30] "No sandbox for pod can be found. 
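The generic.go:334 "container finished" entries carry each container's exit status: exitCode=0 and exitCode=143 in the entries above, and exitCode=2 further down for sg-core. A short decoder follows, assuming the common shell and runtime convention that codes above 128 mean "terminated by signal (code - 128)", which makes 143 the expected SIGTERM result of a grace-period kill:

```go
// Sketch: decode the exitCode values seen in "container finished" entries,
// assuming the common convention that codes above 128 mean
// "terminated by signal (code - 128)".
package main

import "fmt"

func describe(code int) string {
	switch {
	case code == 0:
		return "exited cleanly"
	case code > 128:
		return fmt.Sprintf("killed by signal %d (15=SIGTERM, 9=SIGKILL)", code-128)
	default:
		return fmt.Sprintf("exited on its own with error status %d", code)
	}
}

func main() {
	for _, c := range []int{0, 143, 2} { // values seen in this excerpt
		fmt.Printf("exitCode=%d: %s\n", c, describe(c))
	}
}
```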
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.365071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.365117 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.365220 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.387340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.400106 4763 scope.go:117] "RemoveContainer" containerID="26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.424977 4763 scope.go:117] "RemoveContainer" containerID="105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a" Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.425473 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a\": container with ID starting with 105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a not found: ID does not exist" containerID="105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.425518 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a"} err="failed to get container status \"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a\": rpc error: code = NotFound desc = could not find container \"105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a\": container with ID starting with 105febd570a69ffa471d0a08af482c6b37f8814dac0d164d2ea72ad5e55b150a not found: ID does not exist" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.425547 4763 scope.go:117] "RemoveContainer" containerID="e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977" Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.426167 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977\": container with ID starting with e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977 not found: ID does not exist" containerID="e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.426206 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977"} err="failed to get container status \"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977\": rpc error: code = NotFound desc = could not find container \"e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977\": container with ID starting with e5495748f9ab78a2930efda60a3d364fc72dbec553d75c728a1504fc04061977 not found: ID does not exist" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.426227 4763 scope.go:117] "RemoveContainer" containerID="20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b" Oct 06 15:14:13 
crc kubenswrapper[4763]: E1006 15:14:13.426431 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b\": container with ID starting with 20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b not found: ID does not exist" containerID="20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.426458 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b"} err="failed to get container status \"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b\": rpc error: code = NotFound desc = could not find container \"20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b\": container with ID starting with 20edddd2605ff2b2052aa4a2bf53dbc12ae04980f92cb4d93cc15433476dae8b not found: ID does not exist" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.426474 4763 scope.go:117] "RemoveContainer" containerID="26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36" Oct 06 15:14:13 crc kubenswrapper[4763]: E1006 15:14:13.426689 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36\": container with ID starting with 26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36 not found: ID does not exist" containerID="26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.426710 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36"} err="failed to get container status \"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36\": rpc error: code = NotFound desc = could not find container \"26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36\": container with ID starting with 26626870573acb65815621117aa07db8a924296eafb0fe884fbe38a93734ec36 not found: ID does not exist" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.427705 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:14:13 crc kubenswrapper[4763]: W1006 15:14:13.430307 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod447e0e13_620c_40bb_b13c_5f9e7d5bba4a.slice/crio-c83072254b881ffc43761eb1e090e9f3dc42f88169433338fbe36a374281ed7b WatchSource:0}: Error finding container c83072254b881ffc43761eb1e090e9f3dc42f88169433338fbe36a374281ed7b: Status 404 returned error can't find the container with id c83072254b881ffc43761eb1e090e9f3dc42f88169433338fbe36a374281ed7b Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465060 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht5s\" (UniqueName: \"kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.465500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566722 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vht5s\" (UniqueName: \"kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566800 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566912 
4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566945 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566971 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.566998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.567018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.567458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.568294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.572236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.572981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.574363 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.584301 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.584355 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.585965 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht5s\" (UniqueName: \"kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s\") pod \"ceilometer-0\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.591987 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1f6ac6-57fb-4897-a563-5823c8ed8d3e" path="/var/lib/kubelet/pods/5b1f6ac6-57fb-4897-a563-5823c8ed8d3e/volumes" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.592809 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9bb524-abff-439d-ae56-0fb5f6d5f0bd" path="/var/lib/kubelet/pods/ff9bb524-abff-439d-ae56-0fb5f6d5f0bd/volumes" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.712940 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:13 crc kubenswrapper[4763]: I1006 15:14:13.791036 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:14 crc kubenswrapper[4763]: I1006 15:14:14.213969 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:14 crc kubenswrapper[4763]: W1006 15:14:14.225907 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e93881_adbd_47b9_8b09_57200d8cf3cd.slice/crio-8c36d5ca95efed11d20cdb92ab07c3ffdff7ae80608059ef2143f69a6f256cd5 WatchSource:0}: Error finding container 8c36d5ca95efed11d20cdb92ab07c3ffdff7ae80608059ef2143f69a6f256cd5: Status 404 returned error can't find the container with id 8c36d5ca95efed11d20cdb92ab07c3ffdff7ae80608059ef2143f69a6f256cd5 Oct 06 15:14:14 crc kubenswrapper[4763]: I1006 15:14:14.288994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerStarted","Data":"8c36d5ca95efed11d20cdb92ab07c3ffdff7ae80608059ef2143f69a6f256cd5"} Oct 06 15:14:14 crc kubenswrapper[4763]: I1006 15:14:14.293016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"447e0e13-620c-40bb-b13c-5f9e7d5bba4a","Type":"ContainerStarted","Data":"d62b06765156a7ea72321c0cb90516a0c884d6a312dd38effde07268e0b70bd4"} Oct 06 15:14:14 crc kubenswrapper[4763]: I1006 15:14:14.293046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"447e0e13-620c-40bb-b13c-5f9e7d5bba4a","Type":"ContainerStarted","Data":"c83072254b881ffc43761eb1e090e9f3dc42f88169433338fbe36a374281ed7b"} Oct 06 15:14:14 crc kubenswrapper[4763]: I1006 15:14:14.319401 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.319323152 podStartE2EDuration="2.319323152s" podCreationTimestamp="2025-10-06 15:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:14.312651038 +0000 UTC m=+1251.467943550" watchObservedRunningTime="2025-10-06 15:14:14.319323152 +0000 UTC m=+1251.474615664" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.304143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerStarted","Data":"8e7399b90632bd39a274e250b00fb11c66e76b11817591338bd5f4c8b9c02d07"} Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.783437 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.809002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h27n\" (UniqueName: \"kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n\") pod \"0624b1fc-a666-447b-af25-a231087b13ef\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.813441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n" (OuterVolumeSpecName: "kube-api-access-2h27n") pod "0624b1fc-a666-447b-af25-a231087b13ef" (UID: "0624b1fc-a666-447b-af25-a231087b13ef"). InnerVolumeSpecName "kube-api-access-2h27n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.815167 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs\") pod \"0624b1fc-a666-447b-af25-a231087b13ef\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.815321 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data\") pod \"0624b1fc-a666-447b-af25-a231087b13ef\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.815395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle\") pod \"0624b1fc-a666-447b-af25-a231087b13ef\" (UID: \"0624b1fc-a666-447b-af25-a231087b13ef\") " Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.815633 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs" (OuterVolumeSpecName: "logs") pod "0624b1fc-a666-447b-af25-a231087b13ef" (UID: "0624b1fc-a666-447b-af25-a231087b13ef"). InnerVolumeSpecName "logs". 
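The pod_startup_latency_tracker.go:104 entry above breaks pod startup into SLO and end-to-end durations. The "0001-01-01 00:00:00" pulling timestamps are Go's zero time, used as a sentinel for "no image pull was needed", which is why podStartSLOduration and podStartE2EDuration agree at 2.319 s here; the m=+1251.47 suffixes are klog's monotonic-clock offsets since process start. A minimal sketch, assuming only the field layout visible above, that extracts the end-to-end figure:

```go
// Sketch: pull the end-to-end startup duration out of a
// pod_startup_latency_tracker entry. The field layout is an assumption
// taken from the excerpt above.
package main

import (
	"fmt"
	"regexp"
	"time"
)

var e2e = regexp.MustCompile(`podStartE2EDuration="([^"]+)"`)

func main() {
	entry := `podStartSLOduration=2.319323152 podStartE2EDuration="2.319323152s" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"`
	if m := e2e.FindStringSubmatch(entry); m != nil {
		d, err := time.ParseDuration(m[1])
		if err != nil {
			panic(err)
		}
		fmt.Printf("pod became ready %.2fs after creation\n", d.Seconds())
	}
}
```

The ceilometer-0 entry further down shows the contrasting case: its SLO duration (1.905 s) is shorter than its E2E duration (5.459 s) because real pulling timestamps are present and the pull time is excluded from the SLO figure.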
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.816125 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h27n\" (UniqueName: \"kubernetes.io/projected/0624b1fc-a666-447b-af25-a231087b13ef-kube-api-access-2h27n\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.816153 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0624b1fc-a666-447b-af25-a231087b13ef-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.844942 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0624b1fc-a666-447b-af25-a231087b13ef" (UID: "0624b1fc-a666-447b-af25-a231087b13ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.863293 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data" (OuterVolumeSpecName: "config-data") pod "0624b1fc-a666-447b-af25-a231087b13ef" (UID: "0624b1fc-a666-447b-af25-a231087b13ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.917890 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:15 crc kubenswrapper[4763]: I1006 15:14:15.917925 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0624b1fc-a666-447b-af25-a231087b13ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.329992 4763 generic.go:334] "Generic (PLEG): container finished" podID="0624b1fc-a666-447b-af25-a231087b13ef" containerID="3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8" exitCode=0 Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.330482 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.330933 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerDied","Data":"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8"} Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.330980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0624b1fc-a666-447b-af25-a231087b13ef","Type":"ContainerDied","Data":"b4b9345093a0a28cf9c98eec456995bfa3c9196567cc20220170801d2b528065"} Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.331000 4763 scope.go:117] "RemoveContainer" containerID="3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.344163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerStarted","Data":"deaf16e9a6bba32311802bc8ac7be05ac2d7dfec09b5710301b0dd84ed4bcbae"} Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.388921 4763 scope.go:117] "RemoveContainer" containerID="081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.409947 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.410314 4763 scope.go:117] "RemoveContainer" containerID="3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8" Oct 06 15:14:16 crc kubenswrapper[4763]: E1006 15:14:16.412119 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8\": container with ID starting with 3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8 not found: ID does not exist" containerID="3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.412174 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8"} err="failed to get container status \"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8\": rpc error: code = NotFound desc = could not find container \"3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8\": container with ID starting with 3b2639a27120452dbb7f566e96b845b8cb48c52da3e49f75d9f48d19d4ba0ce8 not found: ID does not exist" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.412195 4763 scope.go:117] "RemoveContainer" containerID="081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4" Oct 06 15:14:16 crc kubenswrapper[4763]: E1006 15:14:16.412445 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4\": container with ID starting with 081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4 not found: ID does not exist" containerID="081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.412459 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4"} 
err="failed to get container status \"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4\": rpc error: code = NotFound desc = could not find container \"081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4\": container with ID starting with 081cfa4f62fc448c6c9ce5615fad314461078feb9fe979d8580f096292aa72e4 not found: ID does not exist" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.418379 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.433096 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:16 crc kubenswrapper[4763]: E1006 15:14:16.433753 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-log" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.433854 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-log" Oct 06 15:14:16 crc kubenswrapper[4763]: E1006 15:14:16.433966 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-api" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.434052 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-api" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.437117 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-log" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.437338 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0624b1fc-a666-447b-af25-a231087b13ef" containerName="nova-api-api" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.443396 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.444972 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.453524 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.453761 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.454098 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528686 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6wl\" (UniqueName: \"kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528806 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528855 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528905 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.528933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6wl\" (UniqueName: \"kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data\") pod 
\"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630917 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.630992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.633640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.645160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.649760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.651468 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.652431 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.657151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6wl\" (UniqueName: \"kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl\") pod \"nova-api-0\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " 
pod="openstack/nova-api-0" Oct 06 15:14:16 crc kubenswrapper[4763]: I1006 15:14:16.767202 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:17 crc kubenswrapper[4763]: I1006 15:14:17.237307 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:17 crc kubenswrapper[4763]: I1006 15:14:17.359311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerStarted","Data":"301c00f086065495e3029a82080d6fccfe9eb8b656c2deef5eb80e9a5ef4d125"} Oct 06 15:14:17 crc kubenswrapper[4763]: I1006 15:14:17.361595 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerStarted","Data":"8f9fa45802b5866c269731e2906082d4842c327bd8c79f3b9582e38e7c137308"} Oct 06 15:14:17 crc kubenswrapper[4763]: I1006 15:14:17.587754 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0624b1fc-a666-447b-af25-a231087b13ef" path="/var/lib/kubelet/pods/0624b1fc-a666-447b-af25-a231087b13ef/volumes" Oct 06 15:14:17 crc kubenswrapper[4763]: I1006 15:14:17.936667 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.378266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerStarted","Data":"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301"} Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.378328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerStarted","Data":"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835"} Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerStarted","Data":"8e50b0ac4a9dedb663c099da2c6a2741fda0645418d4e072a06dc4ea29e2246c"} Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381643 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-central-agent" containerID="cri-o://8e7399b90632bd39a274e250b00fb11c66e76b11817591338bd5f4c8b9c02d07" gracePeriod=30 Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381683 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="sg-core" containerID="cri-o://8f9fa45802b5866c269731e2906082d4842c327bd8c79f3b9582e38e7c137308" gracePeriod=30 Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381713 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-notification-agent" containerID="cri-o://deaf16e9a6bba32311802bc8ac7be05ac2d7dfec09b5710301b0dd84ed4bcbae" gracePeriod=30 Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381729 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.381727 4763 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="proxy-httpd" containerID="cri-o://8e50b0ac4a9dedb663c099da2c6a2741fda0645418d4e072a06dc4ea29e2246c" gracePeriod=30 Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.417832 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.417804369 podStartE2EDuration="2.417804369s" podCreationTimestamp="2025-10-06 15:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:18.402674112 +0000 UTC m=+1255.557966644" watchObservedRunningTime="2025-10-06 15:14:18.417804369 +0000 UTC m=+1255.573096911" Oct 06 15:14:18 crc kubenswrapper[4763]: I1006 15:14:18.459525 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9054755939999999 podStartE2EDuration="5.459495178s" podCreationTimestamp="2025-10-06 15:14:13 +0000 UTC" firstStartedPulling="2025-10-06 15:14:14.236027366 +0000 UTC m=+1251.391319878" lastFinishedPulling="2025-10-06 15:14:17.79004695 +0000 UTC m=+1254.945339462" observedRunningTime="2025-10-06 15:14:18.443887008 +0000 UTC m=+1255.599179560" watchObservedRunningTime="2025-10-06 15:14:18.459495178 +0000 UTC m=+1255.614787730" Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.393602 4763 generic.go:334] "Generic (PLEG): container finished" podID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerID="8e50b0ac4a9dedb663c099da2c6a2741fda0645418d4e072a06dc4ea29e2246c" exitCode=0 Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.394477 4763 generic.go:334] "Generic (PLEG): container finished" podID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerID="8f9fa45802b5866c269731e2906082d4842c327bd8c79f3b9582e38e7c137308" exitCode=2 Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.394552 4763 generic.go:334] "Generic (PLEG): container finished" podID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerID="deaf16e9a6bba32311802bc8ac7be05ac2d7dfec09b5710301b0dd84ed4bcbae" exitCode=0 Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.393643 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerDied","Data":"8e50b0ac4a9dedb663c099da2c6a2741fda0645418d4e072a06dc4ea29e2246c"} Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.394670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerDied","Data":"8f9fa45802b5866c269731e2906082d4842c327bd8c79f3b9582e38e7c137308"} Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.394692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerDied","Data":"deaf16e9a6bba32311802bc8ac7be05ac2d7dfec09b5710301b0dd84ed4bcbae"} Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.788900 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.879122 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"] Oct 06 15:14:19 crc kubenswrapper[4763]: I1006 15:14:19.879403 4763 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerName="dnsmasq-dns" containerID="cri-o://ccd62700fafdf3a6f1684e5b5db825da9d723dfeda19d35346fbeaad12324951" gracePeriod=10 Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.409631 4763 generic.go:334] "Generic (PLEG): container finished" podID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerID="8e7399b90632bd39a274e250b00fb11c66e76b11817591338bd5f4c8b9c02d07" exitCode=0 Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.409658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerDied","Data":"8e7399b90632bd39a274e250b00fb11c66e76b11817591338bd5f4c8b9c02d07"} Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.413355 4763 generic.go:334] "Generic (PLEG): container finished" podID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerID="ccd62700fafdf3a6f1684e5b5db825da9d723dfeda19d35346fbeaad12324951" exitCode=0 Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.413396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" event={"ID":"86597dc8-a99b-4220-a89d-eb7e8a117546","Type":"ContainerDied","Data":"ccd62700fafdf3a6f1684e5b5db825da9d723dfeda19d35346fbeaad12324951"} Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.413440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" event={"ID":"86597dc8-a99b-4220-a89d-eb7e8a117546","Type":"ContainerDied","Data":"cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f"} Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.413454 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc003ae6ae2cc56cb50f4f2c9fc78faca4e39c95b9627d449a7bd1965f7fb66f" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.509060 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.686568 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708167 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5snz\" (UniqueName: \"kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708212 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708543 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.708564 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc\") pod \"86597dc8-a99b-4220-a89d-eb7e8a117546\" (UID: \"86597dc8-a99b-4220-a89d-eb7e8a117546\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.716219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz" (OuterVolumeSpecName: "kube-api-access-r5snz") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "kube-api-access-r5snz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.756841 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.759529 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config" (OuterVolumeSpecName: "config") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.761520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.770441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.771306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86597dc8-a99b-4220-a89d-eb7e8a117546" (UID: "86597dc8-a99b-4220-a89d-eb7e8a117546"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.809905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.810018 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.810095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.810392 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.810419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.810461 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vht5s\" (UniqueName: \"kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 
15:14:20.811116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.811184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml\") pod \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\" (UID: \"a5e93881-adbd-47b9-8b09-57200d8cf3cd\") " Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.811226 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.811397 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812018 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812086 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812102 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5e93881-adbd-47b9-8b09-57200d8cf3cd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812116 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812127 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812138 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812151 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5snz\" (UniqueName: \"kubernetes.io/projected/86597dc8-a99b-4220-a89d-eb7e8a117546-kube-api-access-r5snz\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.812163 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/86597dc8-a99b-4220-a89d-eb7e8a117546-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.814253 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts" (OuterVolumeSpecName: "scripts") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.814751 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s" (OuterVolumeSpecName: "kube-api-access-vht5s") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "kube-api-access-vht5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.847712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.859183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.898652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data" (OuterVolumeSpecName: "config-data") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.910755 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5e93881-adbd-47b9-8b09-57200d8cf3cd" (UID: "a5e93881-adbd-47b9-8b09-57200d8cf3cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913754 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913779 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913790 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vht5s\" (UniqueName: \"kubernetes.io/projected/a5e93881-adbd-47b9-8b09-57200d8cf3cd-kube-api-access-vht5s\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913798 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913806 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:20 crc kubenswrapper[4763]: I1006 15:14:20.913815 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e93881-adbd-47b9-8b09-57200d8cf3cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.427080 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-b9vb2" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.427088 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5e93881-adbd-47b9-8b09-57200d8cf3cd","Type":"ContainerDied","Data":"8c36d5ca95efed11d20cdb92ab07c3ffdff7ae80608059ef2143f69a6f256cd5"} Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.427139 4763 scope.go:117] "RemoveContainer" containerID="8e50b0ac4a9dedb663c099da2c6a2741fda0645418d4e072a06dc4ea29e2246c" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.427104 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.478177 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"] Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.483849 4763 scope.go:117] "RemoveContainer" containerID="8f9fa45802b5866c269731e2906082d4842c327bd8c79f3b9582e38e7c137308" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.490199 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-b9vb2"] Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.499264 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.508281 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.517780 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518243 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="sg-core" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518272 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="sg-core" Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="proxy-httpd" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518304 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="proxy-httpd" Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518321 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-notification-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518331 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-notification-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518341 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerName="dnsmasq-dns" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518347 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerName="dnsmasq-dns" Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518384 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerName="init" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518392 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" containerName="init" Oct 06 15:14:21 crc kubenswrapper[4763]: E1006 15:14:21.518418 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-central-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518425 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-central-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518586 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" 
containerName="dnsmasq-dns" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518626 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="proxy-httpd" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518636 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-central-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518645 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="ceilometer-notification-agent" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.518655 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" containerName="sg-core" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.520356 4763 scope.go:117] "RemoveContainer" containerID="deaf16e9a6bba32311802bc8ac7be05ac2d7dfec09b5710301b0dd84ed4bcbae" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.520870 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.522569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.529410 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.529530 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.553056 4763 scope.go:117] "RemoveContainer" containerID="8e7399b90632bd39a274e250b00fb11c66e76b11817591338bd5f4c8b9c02d07" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.557644 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.602406 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86597dc8-a99b-4220-a89d-eb7e8a117546" path="/var/lib/kubelet/pods/86597dc8-a99b-4220-a89d-eb7e8a117546/volumes" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.603821 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e93881-adbd-47b9-8b09-57200d8cf3cd" path="/var/lib/kubelet/pods/a5e93881-adbd-47b9-8b09-57200d8cf3cd/volumes" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztb8l\" (UniqueName: \"kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd\") pod 
\"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624388 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624446 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.624529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725928 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.725974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.726004 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztb8l\" (UniqueName: \"kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.726051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.726163 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.726938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.731292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.731409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.731549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.731580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.737557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.747083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztb8l\" (UniqueName: \"kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l\") pod \"ceilometer-0\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") " pod="openstack/ceilometer-0" Oct 06 15:14:21 crc kubenswrapper[4763]: I1006 15:14:21.847230 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 15:14:22 crc kubenswrapper[4763]: I1006 15:14:22.281178 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:14:22 crc kubenswrapper[4763]: W1006 15:14:22.288349 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78411411_8959_4af9_9396_864a5dc9f0b1.slice/crio-f799d8b5debb8278650ae1752c19192f7332b2feea64dba5b2b7c380b0cbc8d7 WatchSource:0}: Error finding container f799d8b5debb8278650ae1752c19192f7332b2feea64dba5b2b7c380b0cbc8d7: Status 404 returned error can't find the container with id f799d8b5debb8278650ae1752c19192f7332b2feea64dba5b2b7c380b0cbc8d7 Oct 06 15:14:22 crc kubenswrapper[4763]: I1006 15:14:22.439067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerStarted","Data":"f799d8b5debb8278650ae1752c19192f7332b2feea64dba5b2b7c380b0cbc8d7"} Oct 06 15:14:22 crc kubenswrapper[4763]: I1006 15:14:22.936539 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:22 crc kubenswrapper[4763]: I1006 15:14:22.966558 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.452924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerStarted","Data":"08f39e51da65908a2dcee840d82e7c6b6fa5d8d45e73728b189ebdf1b5b89b0d"} Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.477331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.666052 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8hvd4"] Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.667493 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.670081 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.670197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.681918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8hvd4"] Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.767513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.767573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h466g\" (UniqueName: \"kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.767606 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.767664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.869507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.869562 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h466g\" (UniqueName: \"kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.869604 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.869682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.876374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.877198 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.885237 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.896412 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h466g\" (UniqueName: \"kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g\") pod \"nova-cell1-cell-mapping-8hvd4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:23 crc kubenswrapper[4763]: I1006 15:14:23.994807 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:24 crc kubenswrapper[4763]: I1006 15:14:24.432768 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8hvd4"] Oct 06 15:14:24 crc kubenswrapper[4763]: W1006 15:14:24.449478 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4755a125_256f_47bf_8ed7_10c4662f10b4.slice/crio-d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de WatchSource:0}: Error finding container d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de: Status 404 returned error can't find the container with id d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de Oct 06 15:14:24 crc kubenswrapper[4763]: I1006 15:14:24.470587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerStarted","Data":"c4dfdcfa2f39a9eade765b9d6b125e93d688c2a02a76e0cac1864c972b1b1c39"} Oct 06 15:14:24 crc kubenswrapper[4763]: I1006 15:14:24.472009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8hvd4" event={"ID":"4755a125-256f-47bf-8ed7-10c4662f10b4","Type":"ContainerStarted","Data":"d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de"} Oct 06 15:14:25 crc kubenswrapper[4763]: I1006 15:14:25.484030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerStarted","Data":"b7afa41ebd51bd83378246bbfccfbc1dec4a81fdcfd2d71bd611a3acb01ef475"} Oct 06 15:14:25 crc kubenswrapper[4763]: I1006 15:14:25.486289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8hvd4" event={"ID":"4755a125-256f-47bf-8ed7-10c4662f10b4","Type":"ContainerStarted","Data":"3639f3cd97ad76ec5113dd306c9f03ea2aa1f8d9570485be1cca76f7d68fed2f"} Oct 06 15:14:25 crc kubenswrapper[4763]: I1006 15:14:25.503693 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8hvd4" podStartSLOduration=2.503674397 podStartE2EDuration="2.503674397s" podCreationTimestamp="2025-10-06 15:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:25.500421377 +0000 UTC m=+1262.655713899" watchObservedRunningTime="2025-10-06 15:14:25.503674397 +0000 UTC m=+1262.658966919" Oct 06 15:14:26 crc kubenswrapper[4763]: I1006 15:14:26.499879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerStarted","Data":"c09071984a4aec419c3daa2a55dbf3605ca256c2a5199a60850abcd7fd764271"} Oct 06 15:14:26 crc kubenswrapper[4763]: I1006 15:14:26.500223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 15:14:26 crc kubenswrapper[4763]: I1006 15:14:26.536730 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.944175601 podStartE2EDuration="5.536705866s" podCreationTimestamp="2025-10-06 15:14:21 +0000 UTC" firstStartedPulling="2025-10-06 15:14:22.292962174 +0000 UTC m=+1259.448254686" lastFinishedPulling="2025-10-06 15:14:25.885492429 +0000 UTC m=+1263.040784951" observedRunningTime="2025-10-06 15:14:26.534074823 +0000 UTC 
m=+1263.689367345" watchObservedRunningTime="2025-10-06 15:14:26.536705866 +0000 UTC m=+1263.691998388" Oct 06 15:14:26 crc kubenswrapper[4763]: I1006 15:14:26.768244 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 15:14:26 crc kubenswrapper[4763]: I1006 15:14:26.768310 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 15:14:27 crc kubenswrapper[4763]: I1006 15:14:27.785939 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:27 crc kubenswrapper[4763]: I1006 15:14:27.785963 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:29 crc kubenswrapper[4763]: I1006 15:14:29.539282 4763 generic.go:334] "Generic (PLEG): container finished" podID="4755a125-256f-47bf-8ed7-10c4662f10b4" containerID="3639f3cd97ad76ec5113dd306c9f03ea2aa1f8d9570485be1cca76f7d68fed2f" exitCode=0 Oct 06 15:14:29 crc kubenswrapper[4763]: I1006 15:14:29.539340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8hvd4" event={"ID":"4755a125-256f-47bf-8ed7-10c4662f10b4","Type":"ContainerDied","Data":"3639f3cd97ad76ec5113dd306c9f03ea2aa1f8d9570485be1cca76f7d68fed2f"} Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.014668 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.122648 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h466g\" (UniqueName: \"kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g\") pod \"4755a125-256f-47bf-8ed7-10c4662f10b4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.122736 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts\") pod \"4755a125-256f-47bf-8ed7-10c4662f10b4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.122850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data\") pod \"4755a125-256f-47bf-8ed7-10c4662f10b4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.122891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle\") pod \"4755a125-256f-47bf-8ed7-10c4662f10b4\" (UID: \"4755a125-256f-47bf-8ed7-10c4662f10b4\") " Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.128670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g" (OuterVolumeSpecName: "kube-api-access-h466g") pod "4755a125-256f-47bf-8ed7-10c4662f10b4" (UID: "4755a125-256f-47bf-8ed7-10c4662f10b4"). InnerVolumeSpecName "kube-api-access-h466g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.137224 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts" (OuterVolumeSpecName: "scripts") pod "4755a125-256f-47bf-8ed7-10c4662f10b4" (UID: "4755a125-256f-47bf-8ed7-10c4662f10b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.151306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4755a125-256f-47bf-8ed7-10c4662f10b4" (UID: "4755a125-256f-47bf-8ed7-10c4662f10b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.160970 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data" (OuterVolumeSpecName: "config-data") pod "4755a125-256f-47bf-8ed7-10c4662f10b4" (UID: "4755a125-256f-47bf-8ed7-10c4662f10b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.225329 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h466g\" (UniqueName: \"kubernetes.io/projected/4755a125-256f-47bf-8ed7-10c4662f10b4-kube-api-access-h466g\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.225374 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.225386 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.225399 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755a125-256f-47bf-8ed7-10c4662f10b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.564860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8hvd4" event={"ID":"4755a125-256f-47bf-8ed7-10c4662f10b4","Type":"ContainerDied","Data":"d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de"} Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.564906 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b4413876bc9c0b1bf9dff8aa59f9abd62c4daa81c9c49c034721b03052f9de" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.565014 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8hvd4" Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.755001 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.755239 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-log" containerID="cri-o://7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835" gracePeriod=30 Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.755359 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-api" containerID="cri-o://da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301" gracePeriod=30 Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.783744 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.784207 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5cdea547-7521-4ef5-ac59-a079224a4577" containerName="nova-scheduler-scheduler" containerID="cri-o://757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690" gracePeriod=30 Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.904647 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.904974 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" 
containerName="nova-metadata-log" containerID="cri-o://2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833" gracePeriod=30 Oct 06 15:14:31 crc kubenswrapper[4763]: I1006 15:14:31.905561 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" containerID="cri-o://f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719" gracePeriod=30 Oct 06 15:14:32 crc kubenswrapper[4763]: I1006 15:14:32.579028 4763 generic.go:334] "Generic (PLEG): container finished" podID="7feebe60-edd5-41a6-86da-be1127438714" containerID="7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835" exitCode=143 Oct 06 15:14:32 crc kubenswrapper[4763]: I1006 15:14:32.579133 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerDied","Data":"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835"} Oct 06 15:14:32 crc kubenswrapper[4763]: I1006 15:14:32.582040 4763 generic.go:334] "Generic (PLEG): container finished" podID="a7276637-816e-4a30-85a1-9968546dfc7d" containerID="2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833" exitCode=143 Oct 06 15:14:32 crc kubenswrapper[4763]: I1006 15:14:32.582083 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerDied","Data":"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833"} Oct 06 15:14:33 crc kubenswrapper[4763]: I1006 15:14:33.877337 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:14:33 crc kubenswrapper[4763]: I1006 15:14:33.878451 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:14:33 crc kubenswrapper[4763]: I1006 15:14:33.878655 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:14:33 crc kubenswrapper[4763]: I1006 15:14:33.879811 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:14:33 crc kubenswrapper[4763]: I1006 15:14:33.880088 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a" gracePeriod=600 Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.384466 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.485118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data\") pod \"5cdea547-7521-4ef5-ac59-a079224a4577\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.485183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xlmk\" (UniqueName: \"kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk\") pod \"5cdea547-7521-4ef5-ac59-a079224a4577\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.485238 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle\") pod \"5cdea547-7521-4ef5-ac59-a079224a4577\" (UID: \"5cdea547-7521-4ef5-ac59-a079224a4577\") " Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.490544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk" (OuterVolumeSpecName: "kube-api-access-2xlmk") pod "5cdea547-7521-4ef5-ac59-a079224a4577" (UID: "5cdea547-7521-4ef5-ac59-a079224a4577"). InnerVolumeSpecName "kube-api-access-2xlmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.514270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data" (OuterVolumeSpecName: "config-data") pod "5cdea547-7521-4ef5-ac59-a079224a4577" (UID: "5cdea547-7521-4ef5-ac59-a079224a4577"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.515900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cdea547-7521-4ef5-ac59-a079224a4577" (UID: "5cdea547-7521-4ef5-ac59-a079224a4577"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.588214 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.588249 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xlmk\" (UniqueName: \"kubernetes.io/projected/5cdea547-7521-4ef5-ac59-a079224a4577-kube-api-access-2xlmk\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.588260 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdea547-7521-4ef5-ac59-a079224a4577-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.616436 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a" exitCode=0 Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.616489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a"} Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.616549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e"} Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.616575 4763 scope.go:117] "RemoveContainer" containerID="6fbccdc9483352b1f55c48bbe8b493186e2536c8da1ef82629a7b6dcba09e9ea" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.619097 4763 generic.go:334] "Generic (PLEG): container finished" podID="5cdea547-7521-4ef5-ac59-a079224a4577" containerID="757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690" exitCode=0 Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.619134 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5cdea547-7521-4ef5-ac59-a079224a4577","Type":"ContainerDied","Data":"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690"} Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.619172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5cdea547-7521-4ef5-ac59-a079224a4577","Type":"ContainerDied","Data":"8eada450cd30387121614f710fb08b0fc8effc99159c2393f509c3cee44fe499"} Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.619178 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.659701 4763 scope.go:117] "RemoveContainer" containerID="757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.669553 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.680056 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.689026 4763 scope.go:117] "RemoveContainer" containerID="757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690" Oct 06 15:14:34 crc kubenswrapper[4763]: E1006 15:14:34.689547 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690\": container with ID starting with 757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690 not found: ID does not exist" containerID="757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.689590 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690"} err="failed to get container status \"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690\": rpc error: code = NotFound desc = could not find container \"757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690\": container with ID starting with 757998228ef48133510132c96ad6815bcb21966320a8329d5064d23728e2a690 not found: ID does not exist" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.693849 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:34 crc kubenswrapper[4763]: E1006 15:14:34.694287 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdea547-7521-4ef5-ac59-a079224a4577" containerName="nova-scheduler-scheduler" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.694326 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdea547-7521-4ef5-ac59-a079224a4577" containerName="nova-scheduler-scheduler" Oct 06 15:14:34 crc kubenswrapper[4763]: E1006 15:14:34.694364 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4755a125-256f-47bf-8ed7-10c4662f10b4" containerName="nova-manage" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.694373 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4755a125-256f-47bf-8ed7-10c4662f10b4" containerName="nova-manage" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.694591 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdea547-7521-4ef5-ac59-a079224a4577" containerName="nova-scheduler-scheduler" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.694609 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4755a125-256f-47bf-8ed7-10c4662f10b4" containerName="nova-manage" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.695256 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.698035 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.703470 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.792160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.792281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5hd\" (UniqueName: \"kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.792909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.894731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.894844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.894871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5hd\" (UniqueName: \"kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.901447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.901741 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:34 crc kubenswrapper[4763]: I1006 15:14:34.917674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5hd\" (UniqueName: 
\"kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd\") pod \"nova-scheduler-0\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") " pod="openstack/nova-scheduler-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.013294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.055728 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:56288->10.217.0.191:8775: read: connection reset by peer" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.055754 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:56286->10.217.0.191:8775: read: connection reset by peer" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.414421 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.498537 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x6wl\" (UniqueName: \"kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: \"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505427 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs\") pod \"7feebe60-edd5-41a6-86da-be1127438714\" (UID: 
\"7feebe60-edd5-41a6-86da-be1127438714\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.505684 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs" (OuterVolumeSpecName: "logs") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.507173 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7feebe60-edd5-41a6-86da-be1127438714-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.511698 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl" (OuterVolumeSpecName: "kube-api-access-8x6wl") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "kube-api-access-8x6wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.523211 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.578107 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.588842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data" (OuterVolumeSpecName: "config-data") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.596923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.603567 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdea547-7521-4ef5-ac59-a079224a4577" path="/var/lib/kubelet/pods/5cdea547-7521-4ef5-ac59-a079224a4577/volumes" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.608656 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26vxn\" (UniqueName: \"kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn\") pod \"a7276637-816e-4a30-85a1-9968546dfc7d\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.608824 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs\") pod \"a7276637-816e-4a30-85a1-9968546dfc7d\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.608868 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs\") pod \"a7276637-816e-4a30-85a1-9968546dfc7d\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.608918 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data\") pod \"a7276637-816e-4a30-85a1-9968546dfc7d\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.608964 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle\") pod \"a7276637-816e-4a30-85a1-9968546dfc7d\" (UID: \"a7276637-816e-4a30-85a1-9968546dfc7d\") " Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.609352 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.609371 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x6wl\" (UniqueName: \"kubernetes.io/projected/7feebe60-edd5-41a6-86da-be1127438714-kube-api-access-8x6wl\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.609385 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.609396 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.609700 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs" (OuterVolumeSpecName: "logs") pod "a7276637-816e-4a30-85a1-9968546dfc7d" (UID: "a7276637-816e-4a30-85a1-9968546dfc7d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.616192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn" (OuterVolumeSpecName: "kube-api-access-26vxn") pod "a7276637-816e-4a30-85a1-9968546dfc7d" (UID: "a7276637-816e-4a30-85a1-9968546dfc7d"). InnerVolumeSpecName "kube-api-access-26vxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.616503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7feebe60-edd5-41a6-86da-be1127438714" (UID: "7feebe60-edd5-41a6-86da-be1127438714"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.660012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c67adc5-b329-4832-a9e6-711a70d0021e","Type":"ContainerStarted","Data":"1d1058d2f7ca6c9aac6d91aa892c8622b8de5cddf8a0ed3a9e8c133fb306f6a0"} Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.672923 4763 generic.go:334] "Generic (PLEG): container finished" podID="7feebe60-edd5-41a6-86da-be1127438714" containerID="da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301" exitCode=0 Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.672994 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.673020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerDied","Data":"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301"} Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.673070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7feebe60-edd5-41a6-86da-be1127438714","Type":"ContainerDied","Data":"301c00f086065495e3029a82080d6fccfe9eb8b656c2deef5eb80e9a5ef4d125"} Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.673087 4763 scope.go:117] "RemoveContainer" containerID="da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.675009 4763 generic.go:334] "Generic (PLEG): container finished" podID="a7276637-816e-4a30-85a1-9968546dfc7d" containerID="f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719" exitCode=0 Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.675075 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.675102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerDied","Data":"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719"} Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.675155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7276637-816e-4a30-85a1-9968546dfc7d","Type":"ContainerDied","Data":"8db1c7e515f46fa2a5d398392ce7704f34918112c500df73e8606889cd188bf6"} Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.675143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data" (OuterVolumeSpecName: "config-data") pod "a7276637-816e-4a30-85a1-9968546dfc7d" (UID: "a7276637-816e-4a30-85a1-9968546dfc7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.677007 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7276637-816e-4a30-85a1-9968546dfc7d" (UID: "a7276637-816e-4a30-85a1-9968546dfc7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.693402 4763 scope.go:117] "RemoveContainer" containerID="7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.706592 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a7276637-816e-4a30-85a1-9968546dfc7d" (UID: "a7276637-816e-4a30-85a1-9968546dfc7d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.709813 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.710976 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7feebe60-edd5-41a6-86da-be1127438714-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.711002 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26vxn\" (UniqueName: \"kubernetes.io/projected/a7276637-816e-4a30-85a1-9968546dfc7d-kube-api-access-26vxn\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.711014 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7276637-816e-4a30-85a1-9968546dfc7d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.711025 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.711035 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.711046 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7276637-816e-4a30-85a1-9968546dfc7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.726145 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.736436 4763 scope.go:117] "RemoveContainer" containerID="da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.736563 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.737154 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-log" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737180 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-log" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.737195 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-log" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737205 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-log" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.737224 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-api" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737231 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-api" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.737261 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737272 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737502 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-log" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737534 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" containerName="nova-metadata-metadata" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737554 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-api" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.737568 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7feebe60-edd5-41a6-86da-be1127438714" containerName="nova-api-log" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.737961 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301\": container with ID starting with da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301 not found: ID does not exist" containerID="da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.738009 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301"} err="failed to get container status \"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301\": rpc error: code = NotFound desc = could not find container \"da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301\": container with ID starting with da3cb15c376c8de6bdf2fb21ec753608c72c006bda506ebeb477242ded6cc301 not found: ID does not exist" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.738038 4763 scope.go:117] "RemoveContainer" containerID="7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.738438 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835\": container with ID starting with 7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835 not found: ID does not exist" containerID="7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.738471 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835"} err="failed to get container status \"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835\": rpc error: code = NotFound desc = could not find container \"7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835\": container with ID starting with 7205cd2129c9376ac1c5abedfce13fdd3ab85eec202862bd609156821e928835 not found: ID does not exist" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.738492 4763 scope.go:117] "RemoveContainer" 
containerID="f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.739168 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.741168 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.742007 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.742329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.761775 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.770949 4763 scope.go:117] "RemoveContainer" containerID="2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.794821 4763 scope.go:117] "RemoveContainer" containerID="f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.795197 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719\": container with ID starting with f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719 not found: ID does not exist" containerID="f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.795232 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719"} err="failed to get container status \"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719\": rpc error: code = NotFound desc = could not find container \"f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719\": container with ID starting with f8ac59fd4db443791d49921ba298fe2bdefede885398e306f5915c1560a8a719 not found: ID does not exist" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.795261 4763 scope.go:117] "RemoveContainer" containerID="2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833" Oct 06 15:14:35 crc kubenswrapper[4763]: E1006 15:14:35.795519 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833\": container with ID starting with 2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833 not found: ID does not exist" containerID="2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.795559 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833"} err="failed to get container status \"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833\": rpc error: code = NotFound desc = could not find container \"2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833\": container with ID starting with 2cf376f2993d07dbd69bd81db1ef6d4d4eacf24ef9e57bcd7094559d9e0a3833 not found: ID does not exist" Oct 06 
15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812245 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812270 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78f9k\" (UniqueName: \"kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812417 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.812444 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78f9k\" (UniqueName: \"kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.914594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.915189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.919457 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.921211 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.921864 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.921945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:35 crc kubenswrapper[4763]: I1006 15:14:35.930302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78f9k\" (UniqueName: \"kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k\") pod \"nova-api-0\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " pod="openstack/nova-api-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.025931 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.040067 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.048711 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.050263 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.052108 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.055343 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.072558 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.073025 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.220357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.220386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.220434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.220494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwvc\" (UniqueName: \"kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.220526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.322416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.322455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.322506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.322567 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwvc\" (UniqueName: \"kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.322626 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.323747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.328369 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.330111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.340178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.340886 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwvc\" (UniqueName: \"kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc\") pod \"nova-metadata-0\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") " pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.478315 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.510291 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: W1006 15:14:36.517364 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bebf8d6_16bb_4dcf_afac_6a1a55e81350.slice/crio-84962c4492800f166ff497fa724946007caf829226e542df831dcf6fc6d272ae WatchSource:0}: Error finding container 84962c4492800f166ff497fa724946007caf829226e542df831dcf6fc6d272ae: Status 404 returned error can't find the container with id 84962c4492800f166ff497fa724946007caf829226e542df831dcf6fc6d272ae Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.695428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c67adc5-b329-4832-a9e6-711a70d0021e","Type":"ContainerStarted","Data":"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab"} Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.698240 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerStarted","Data":"4f656fbf184d6e7e7f0539b3b3bd12e503563cb2565c711ff5b955951ea221f9"} Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.698278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerStarted","Data":"84962c4492800f166ff497fa724946007caf829226e542df831dcf6fc6d272ae"} Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.711848 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.711829539 podStartE2EDuration="2.711829539s" podCreationTimestamp="2025-10-06 15:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:36.711205732 +0000 UTC m=+1273.866498274" watchObservedRunningTime="2025-10-06 15:14:36.711829539 +0000 UTC m=+1273.867122071" Oct 06 15:14:36 crc kubenswrapper[4763]: I1006 15:14:36.949954 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:14:36 crc kubenswrapper[4763]: W1006 15:14:36.955509 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b427d1_75e9_4b32_afeb_f895661ddbe1.slice/crio-fe4e317a632f70a76770c9025334e8eabeb0a3254b6ca161eaa753f85e27f82c WatchSource:0}: Error finding container fe4e317a632f70a76770c9025334e8eabeb0a3254b6ca161eaa753f85e27f82c: Status 404 returned error can't find the container with id fe4e317a632f70a76770c9025334e8eabeb0a3254b6ca161eaa753f85e27f82c Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.585865 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7feebe60-edd5-41a6-86da-be1127438714" path="/var/lib/kubelet/pods/7feebe60-edd5-41a6-86da-be1127438714/volumes" Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.586932 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7276637-816e-4a30-85a1-9968546dfc7d" path="/var/lib/kubelet/pods/a7276637-816e-4a30-85a1-9968546dfc7d/volumes" Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.711316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerStarted","Data":"adcaf8c8d3f281041745bafa4ac6ec897a52c954a8e11f85b8df351c7b2de708"} Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.715256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerStarted","Data":"3371eb2323b8c35c65358b46ed5e998f521b7d07b05bcf5e5f18ac2589423079"} Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.715294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerStarted","Data":"c218a29eb7d3663d26274e3363b8a572e07d93eb706bdfc4ab5ab9ee292f214b"} Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.715308 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerStarted","Data":"fe4e317a632f70a76770c9025334e8eabeb0a3254b6ca161eaa753f85e27f82c"} Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.745900 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.745850905 podStartE2EDuration="2.745850905s" podCreationTimestamp="2025-10-06 15:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:37.731568612 +0000 UTC m=+1274.886861144" watchObservedRunningTime="2025-10-06 15:14:37.745850905 +0000 UTC m=+1274.901143427" Oct 06 15:14:37 crc kubenswrapper[4763]: I1006 15:14:37.763541 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.763518412 podStartE2EDuration="1.763518412s" podCreationTimestamp="2025-10-06 15:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:14:37.755237204 +0000 UTC m=+1274.910529736" watchObservedRunningTime="2025-10-06 15:14:37.763518412 +0000 UTC m=+1274.918810934" Oct 06 15:14:40 crc kubenswrapper[4763]: I1006 15:14:40.013752 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 15:14:41 crc kubenswrapper[4763]: I1006 15:14:41.478836 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 15:14:41 crc kubenswrapper[4763]: I1006 15:14:41.478913 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 15:14:45 crc kubenswrapper[4763]: I1006 15:14:45.014476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 15:14:45 crc kubenswrapper[4763]: I1006 15:14:45.062426 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 15:14:45 crc kubenswrapper[4763]: I1006 15:14:45.865321 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 15:14:46 crc kubenswrapper[4763]: I1006 15:14:46.073697 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 15:14:46 crc kubenswrapper[4763]: I1006 15:14:46.074101 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 15:14:46 crc kubenswrapper[4763]: I1006 
15:14:46.479257 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 15:14:46 crc kubenswrapper[4763]: I1006 15:14:46.479297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 15:14:47 crc kubenswrapper[4763]: I1006 15:14:47.081835 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:47 crc kubenswrapper[4763]: I1006 15:14:47.090992 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:47 crc kubenswrapper[4763]: I1006 15:14:47.489824 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:47 crc kubenswrapper[4763]: I1006 15:14:47.489852 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 15:14:51 crc kubenswrapper[4763]: I1006 15:14:51.855931 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.080883 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.081691 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.087570 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.094427 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.485825 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.491702 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.493802 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.956926 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.964337 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 15:14:56 crc kubenswrapper[4763]: I1006 15:14:56.977744 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.164364 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr"] Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.166321 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.169361 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.169429 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.175976 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr"] Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.240521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl2vn\" (UniqueName: \"kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.240790 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.240888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.342395 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.342522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.342888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl2vn\" (UniqueName: \"kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn\") pod \"collect-profiles-29329395-lgkmr\" (UID: 
\"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.344067 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.350055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.365844 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl2vn\" (UniqueName: \"kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn\") pod \"collect-profiles-29329395-lgkmr\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.495750 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:00 crc kubenswrapper[4763]: I1006 15:15:00.975758 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr"] Oct 06 15:15:00 crc kubenswrapper[4763]: W1006 15:15:00.976360 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57c7e43_942b_418a_978c_87a2e535d430.slice/crio-06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7 WatchSource:0}: Error finding container 06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7: Status 404 returned error can't find the container with id 06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7 Oct 06 15:15:01 crc kubenswrapper[4763]: I1006 15:15:01.001811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" event={"ID":"e57c7e43-942b-418a-978c-87a2e535d430","Type":"ContainerStarted","Data":"06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7"} Oct 06 15:15:02 crc kubenswrapper[4763]: I1006 15:15:02.017370 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57c7e43-942b-418a-978c-87a2e535d430" containerID="1064598d9fd0ba5a4231a251144278663c79cbd4c6f961d209063c216fe15c4a" exitCode=0 Oct 06 15:15:02 crc kubenswrapper[4763]: I1006 15:15:02.017445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" event={"ID":"e57c7e43-942b-418a-978c-87a2e535d430","Type":"ContainerDied","Data":"1064598d9fd0ba5a4231a251144278663c79cbd4c6f961d209063c216fe15c4a"} Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.429573 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.505523 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume\") pod \"e57c7e43-942b-418a-978c-87a2e535d430\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.505881 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl2vn\" (UniqueName: \"kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn\") pod \"e57c7e43-942b-418a-978c-87a2e535d430\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.506042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume\") pod \"e57c7e43-942b-418a-978c-87a2e535d430\" (UID: \"e57c7e43-942b-418a-978c-87a2e535d430\") " Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.506361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume" (OuterVolumeSpecName: "config-volume") pod "e57c7e43-942b-418a-978c-87a2e535d430" (UID: "e57c7e43-942b-418a-978c-87a2e535d430"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.507030 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e57c7e43-942b-418a-978c-87a2e535d430-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.514430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn" (OuterVolumeSpecName: "kube-api-access-dl2vn") pod "e57c7e43-942b-418a-978c-87a2e535d430" (UID: "e57c7e43-942b-418a-978c-87a2e535d430"). InnerVolumeSpecName "kube-api-access-dl2vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.515363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e57c7e43-942b-418a-978c-87a2e535d430" (UID: "e57c7e43-942b-418a-978c-87a2e535d430"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.608951 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl2vn\" (UniqueName: \"kubernetes.io/projected/e57c7e43-942b-418a-978c-87a2e535d430-kube-api-access-dl2vn\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:03 crc kubenswrapper[4763]: I1006 15:15:03.609004 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e57c7e43-942b-418a-978c-87a2e535d430-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:04 crc kubenswrapper[4763]: I1006 15:15:04.046117 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" event={"ID":"e57c7e43-942b-418a-978c-87a2e535d430","Type":"ContainerDied","Data":"06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7"} Oct 06 15:15:04 crc kubenswrapper[4763]: I1006 15:15:04.046172 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06d0eb9f9576e5a51044063ea64387d0f3837dc786f57af2d956508c6f4a77d7" Oct 06 15:15:04 crc kubenswrapper[4763]: I1006 15:15:04.046231 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr" Oct 06 15:15:17 crc kubenswrapper[4763]: I1006 15:15:17.923442 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 15:15:17 crc kubenswrapper[4763]: I1006 15:15:17.924135 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f62b41ca-5e6d-4760-be62-af924b841737" containerName="openstackclient" containerID="cri-o://ba6cd5ea4c4214f0c3ffef878583fe94e05f8dc0d7cd4b6711ee7dd59935b9a6" gracePeriod=2 Oct 06 15:15:17 crc kubenswrapper[4763]: I1006 15:15:17.940734 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.010362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.117135 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.117188 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data podName:2fad9bbe-33dc-4f1d-a156-52bbd3a69273 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:18.617175416 +0000 UTC m=+1315.772467928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data") pod "rabbitmq-cell1-server-0" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273") : configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.128712 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.128947 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="ovn-northd" containerID="cri-o://8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" gracePeriod=30 Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.129073 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="openstack-network-exporter" containerID="cri-o://27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950" gracePeriod=30 Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.170733 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"] Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.171100 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62b41ca-5e6d-4760-be62-af924b841737" containerName="openstackclient" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.171116 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62b41ca-5e6d-4760-be62-af924b841737" containerName="openstackclient" Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.171133 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57c7e43-942b-418a-978c-87a2e535d430" containerName="collect-profiles" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.171139 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57c7e43-942b-418a-978c-87a2e535d430" containerName="collect-profiles" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.171329 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62b41ca-5e6d-4760-be62-af924b841737" containerName="openstackclient" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.171354 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57c7e43-942b-418a-978c-87a2e535d430" containerName="collect-profiles" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.171916 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.206789 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.216036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf8mh\" (UniqueName: \"kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh\") pod \"neutron11f1-account-delete-gjtrg\" (UID: \"aa1a60e3-da79-4605-8b0b-329ac33c07a9\") " pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.230714 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.231877 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.287245 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.322702 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s2f\" (UniqueName: \"kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f\") pod \"placement19ad-account-delete-xlzs4\" (UID: \"aea53287-9722-47c6-a937-8a267b981e92\") " pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.322804 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf8mh\" (UniqueName: \"kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh\") pod \"neutron11f1-account-delete-gjtrg\" (UID: \"aa1a60e3-da79-4605-8b0b-329ac33c07a9\") " pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.388203 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p7v6n"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.389543 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf8mh\" (UniqueName: \"kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh\") pod \"neutron11f1-account-delete-gjtrg\" (UID: \"aa1a60e3-da79-4605-8b0b-329ac33c07a9\") " pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.426634 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s2f\" (UniqueName: \"kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f\") pod \"placement19ad-account-delete-xlzs4\" (UID: \"aea53287-9722-47c6-a937-8a267b981e92\") " pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.435493 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p7v6n"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.463969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s2f\" (UniqueName: \"kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f\") pod \"placement19ad-account-delete-xlzs4\" (UID: \"aea53287-9722-47c6-a937-8a267b981e92\") " 
pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.467493 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancebac8-account-delete-rw77w"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.468801 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.488074 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancebac8-account-delete-rw77w"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.502381 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.506962 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.512969 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ssv6p"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.533967 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.548485 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ssv6p"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.554757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5s67\" (UniqueName: \"kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67\") pod \"cinder1178-account-delete-jjvjp\" (UID: \"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5\") " pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.554871 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzfn\" (UniqueName: \"kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn\") pod \"glancebac8-account-delete-rw77w\" (UID: \"5df38d69-816c-41c3-8de5-b270104ebb23\") " pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.587286 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.587845 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="openstack-network-exporter" containerID="cri-o://2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc" gracePeriod=300 Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.594228 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.620985 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.655536 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.658049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5s67\" (UniqueName: \"kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67\") pod \"cinder1178-account-delete-jjvjp\" (UID: \"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5\") " pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.658153 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzfn\" (UniqueName: \"kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn\") pod \"glancebac8-account-delete-rw77w\" (UID: \"5df38d69-816c-41c3-8de5-b270104ebb23\") " pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.659117 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.659157 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data podName:2fad9bbe-33dc-4f1d-a156-52bbd3a69273 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:19.659143805 +0000 UTC m=+1316.814436317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data") pod "rabbitmq-cell1-server-0" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273") : configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.697309 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.698608 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.702471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5s67\" (UniqueName: \"kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67\") pod \"cinder1178-account-delete-jjvjp\" (UID: \"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5\") " pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.741712 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.759252 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzfn\" (UniqueName: \"kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn\") pod \"glancebac8-account-delete-rw77w\" (UID: \"5df38d69-816c-41c3-8de5-b270104ebb23\") " pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.825511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.843065 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.867106 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dj9c\" (UniqueName: \"kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c\") pod \"novacell1ca15-account-delete-fvfnr\" (UID: \"2235b0e6-860e-450c-b129-f0082e1670e1\") " pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.868471 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: E1006 15:15:18.868519 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data podName:5c83a4de-f6df-4d0e-9bd0-03cbcb877f43 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:19.368503899 +0000 UTC m=+1316.523796411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data") pod "rabbitmq-server-0" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43") : configmap "rabbitmq-config-data" not found Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.899749 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-r2m7t"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.919652 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-r2m7t"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.925821 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vs2bl"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.934080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vs2bl"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.942698 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fpb82"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.948050 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fpb82"] Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.972554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dj9c\" (UniqueName: \"kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c\") pod \"novacell1ca15-account-delete-fvfnr\" (UID: \"2235b0e6-860e-450c-b129-f0082e1670e1\") " pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:18 crc kubenswrapper[4763]: I1006 15:15:18.992808 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.005457 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dj9c\" (UniqueName: \"kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c\") pod \"novacell1ca15-account-delete-fvfnr\" (UID: \"2235b0e6-860e-450c-b129-f0082e1670e1\") " pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.010458 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.010728 4763 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ovn-controller-metrics-6cpk4" podUID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" containerName="openstack-network-exporter" containerID="cri-o://8095d8ba556be14995d51b888d8ed4a97695bfea0a9a26d8a90c3415861649c3" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.019034 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lw4hs"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.022681 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="ovsdbserver-nb" containerID="cri-o://37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" gracePeriod=300 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.041887 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.115297 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.138748 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b667cdf65-js4pw" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-api" containerID="cri-o://e605b0aaa421d3c856e90bbcb0d9a8126bc1d053474a70e3d68b5174771747d5" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.139110 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b667cdf65-js4pw" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-httpd" containerID="cri-o://0ff5b5a9d760f2d9ecf1a7c7038e9965e60f8bcb7447f920abe0567b0a54badf" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.154998 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wg4dh"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.203780 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wg4dh"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.214194 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-79c97876dd-6hbjr"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.214403 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-79c97876dd-6hbjr" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-log" containerID="cri-o://260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.215228 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-79c97876dd-6hbjr" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-api" containerID="cri-o://ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.316847 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6cpk4_40580e5d-8c54-477e-af15-1ba2cf5d3dc0/openstack-network-exporter/0.log" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.317067 4763 generic.go:334] "Generic (PLEG): container finished" podID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" containerID="8095d8ba556be14995d51b888d8ed4a97695bfea0a9a26d8a90c3415861649c3" exitCode=2 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 
15:15:19.317118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cpk4" event={"ID":"40580e5d-8c54-477e-af15-1ba2cf5d3dc0","Type":"ContainerDied","Data":"8095d8ba556be14995d51b888d8ed4a97695bfea0a9a26d8a90c3415861649c3"} Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.332333 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1bcb31ee-4374-46aa-ab52-39f216f2bf67/ovsdbserver-nb/0.log" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.332376 4763 generic.go:334] "Generic (PLEG): container finished" podID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerID="2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc" exitCode=2 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.332441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerDied","Data":"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc"} Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.334546 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-l75n6"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.346820 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7530761-b715-4178-8d58-5e1cd54838d0" containerID="27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950" exitCode=2 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.346860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerDied","Data":"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950"} Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.358671 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-l75n6"] Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.395085 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.395150 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data podName:5c83a4de-f6df-4d0e-9bd0-03cbcb877f43 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:20.39513683 +0000 UTC m=+1317.550429342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data") pod "rabbitmq-server-0" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43") : configmap "rabbitmq-config-data" not found Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.405272 4763 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-lw4hs" message="Exiting ovn-controller (1) " Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.405313 4763 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-lw4hs" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" containerID="cri-o://3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.405551 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-lw4hs" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" containerID="cri-o://3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.436922 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee is running failed: container process not found" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.437974 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee is running failed: container process not found" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.438486 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee is running failed: container process not found" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.438521 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="ovsdbserver-nb" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.511646 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.512377 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b444a5f8-8311-488c-b612-2d44328edc52" 
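
[annotation] Exit status 137 in the "PreStop hook failed" entries is the POSIX shell convention 128 + signal number, i.e. SIGKILL (9): the ovn-ctl stop_controller hook was terminated by SIGKILL rather than exiting on its own. The ExecSync "container is not created or running" errors right after are the readiness probe racing pod teardown; the ovsdb-server process is already gone when /usr/bin/pidof is exec'd. The 137 convention is easy to reproduce in any shell:

    $ sh -c 'kill -KILL $$'; echo $?
    137
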
containerName="openstack-network-exporter" containerID="cri-o://a14cb1fee4bc452ad0d1ead1b1070a9175132789cf27b82cbc5f083baa390272" gracePeriod=300 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.633006 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="ovsdbserver-sb" containerID="cri-o://5c5d1d55b187f82aaee42975506945a3d1714a7b414a61bf50e5ba10dc72666f" gracePeriod=300 Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.663528 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.664256 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c92ac5-96c4-45f2-9fbf-3c43dd548dbb" path="/var/lib/kubelet/pods/29c92ac5-96c4-45f2-9fbf-3c43dd548dbb/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: E1006 15:15:19.664699 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data podName:2fad9bbe-33dc-4f1d-a156-52bbd3a69273 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:21.66466984 +0000 UTC m=+1318.819962352 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data") pod "rabbitmq-cell1-server-0" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273") : configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.664931 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3209a935-b3c7-4cfd-961b-1a7550aa1f63" path="/var/lib/kubelet/pods/3209a935-b3c7-4cfd-961b-1a7550aa1f63/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.673772 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ec9a93-3d8c-49a0-8dcd-bb10f1270deb" path="/var/lib/kubelet/pods/48ec9a93-3d8c-49a0-8dcd-bb10f1270deb/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.674643 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e3da3e-5e14-48a9-bb24-cc1a0b6fadae" path="/var/lib/kubelet/pods/67e3da3e-5e14-48a9-bb24-cc1a0b6fadae/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.675310 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad00dc53-32d8-4edd-ab1c-e9467d8be9eb" path="/var/lib/kubelet/pods/ad00dc53-32d8-4edd-ab1c-e9467d8be9eb/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.678740 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a00a8d-3928-4ece-9d1c-c3ca6993756b" path="/var/lib/kubelet/pods/c5a00a8d-3928-4ece-9d1c-c3ca6993756b/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.679360 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb17b8c1-c76b-4802-aac9-daaacea9e726" path="/var/lib/kubelet/pods/eb17b8c1-c76b-4802-aac9-daaacea9e726/volumes" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.680113 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8hvd4"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.737080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8hvd4"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.759997 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.760247 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="dnsmasq-dns" containerID="cri-o://e558c915bfc1f5ce99283b28f4010dce7a4750b02b11e663509a4bc0669fd4a9" gracePeriod=10 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.775907 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776333 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-server" containerID="cri-o://f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776735 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="swift-recon-cron" containerID="cri-o://45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776792 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="rsync" containerID="cri-o://2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776824 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-expirer" containerID="cri-o://4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776851 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-updater" containerID="cri-o://d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776879 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-auditor" containerID="cri-o://cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776907 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-replicator" containerID="cri-o://9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776937 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-server" containerID="cri-o://f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776965 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-updater" containerID="cri-o://ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.776992 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-auditor" containerID="cri-o://d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.777028 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-replicator" containerID="cri-o://bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.777058 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-server" containerID="cri-o://4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.777085 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-reaper" containerID="cri-o://1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.777112 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-auditor" containerID="cri-o://f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.777146 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-replicator" containerID="cri-o://04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.788670 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.196:5353: connect: connection refused" Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.816719 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.817056 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-log" containerID="cri-o://8f6a767bdad431305d84e26950c1846ebe85fc9d498b6fa5277ed34f9be13eb0" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.817427 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-httpd" containerID="cri-o://d1eb8ece3ed05071e8f0e183f59d758fa3e0c6899e555329438a62699f5c4813" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 
15:15:19.860124 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.860392 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="cinder-scheduler" containerID="cri-o://8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.860807 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="probe" containerID="cri-o://23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.884389 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.884688 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-log" containerID="cri-o://aa4a777788599da63887c8bae5fa04a3d16d36136221168f67e902cc9b4ffdbc" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.884979 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-httpd" containerID="cri-o://08a1f33d966b3e95b4e43e2c4c9605f94fb70073e52950d5c8d680e81f2eeaa1" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.899314 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.899584 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api-log" containerID="cri-o://33de3bfd1cb8b6e84aac16172cd6d4816d4bbcc99bd594f7601b9f6bae6bc708" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.900132 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api" containerID="cri-o://6e23f0e5ee7363ef074a84ef0fdf33f87ae71f9dfadd96d75eeac46f6868a70c" gracePeriod=30 Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.973095 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-19ad-account-create-x69kg"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.986799 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-19ad-account-create-x69kg"] Oct 06 15:15:19 crc kubenswrapper[4763]: I1006 15:15:19.990566 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qrnms"] Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.014144 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qrnms"] Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.027939 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"] Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.057390 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:20 crc 
kubenswrapper[4763]: I1006 15:15:20.086599 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-11f1-account-create-xdj7v"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.089367 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-11f1-account-create-xdj7v"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.103833 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" containerID="cri-o://645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" gracePeriod=29
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.117944 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zmbmn"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.133259 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fkvpx"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.149411 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zmbmn"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.161693 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fkvpx"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.191778 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2flgn"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.213959 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2flgn"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.244748 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4bac-account-create-bk65b"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.267764 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4bac-account-create-bk65b"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.278417 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6cpk4_40580e5d-8c54-477e-af15-1ba2cf5d3dc0/openstack-network-exporter/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.278710 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.298836 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1bcb31ee-4374-46aa-ab52-39f216f2bf67/ovsdbserver-nb/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.298953 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.303147 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bac8-account-create-x9mm7"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.315794 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bac8-account-create-x9mm7"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.328159 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancebac8-account-delete-rw77w"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.340416 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.357131 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gngw9"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.388678 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gngw9"]
Oct 06 15:15:20 crc kubenswrapper[4763]: E1006 15:15:20.397924 4763 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Oct 06 15:15:20 crc kubenswrapper[4763]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Oct 06 15:15:20 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNBridge=br-int
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNEncapType=geneve
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNAvailabilityZones=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ PhysicalNetworks=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNHostName=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore
Oct 06 15:15:20 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Oct 06 15:15:20 crc kubenswrapper[4763]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-cf4dn" message=<
Oct 06 15:15:20 crc kubenswrapper[4763]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Oct 06 15:15:20 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNBridge=br-int
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNEncapType=geneve
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNAvailabilityZones=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ PhysicalNetworks=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNHostName=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore
Oct 06 15:15:20 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Oct 06 15:15:20 crc kubenswrapper[4763]: >
Oct 06 15:15:20 crc kubenswrapper[4763]: E1006 15:15:20.397977 4763 kuberuntime_container.go:691] "PreStop hook failed" err=<
Oct 06 15:15:20 crc kubenswrapper[4763]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Oct 06 15:15:20 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNBridge=br-int
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNEncapType=geneve
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNAvailabilityZones=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ PhysicalNetworks=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ OVNHostName=
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Oct 06 15:15:20 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + sleep 0.5
Oct 06 15:15:20 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Oct 06 15:15:20 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore
Oct 06 15:15:20 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Oct 06 15:15:20 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Oct 06 15:15:20 crc kubenswrapper[4763]: > pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" containerID="cri-o://0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398018 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" containerID="cri-o://0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" gracePeriod=29
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xplq\" (UniqueName: \"kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398496 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398554 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7c4z\" (UniqueName: \"kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398760 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398934 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs\") pod \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\" (UID: \"1bcb31ee-4374-46aa-ab52-39f216f2bf67\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.398949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir\") pod \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\" (UID: \"40580e5d-8c54-477e-af15-1ba2cf5d3dc0\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: E1006 15:15:20.399343 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 06 15:15:20 crc kubenswrapper[4763]: E1006 15:15:20.399383 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data podName:5c83a4de-f6df-4d0e-9bd0-03cbcb877f43 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:22.399370701 +0000 UTC m=+1319.554663203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data") pod "rabbitmq-server-0" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43") : configmap "rabbitmq-config-data" not found
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.400011 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.400431 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config" (OuterVolumeSpecName: "config") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.407462 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config" (OuterVolumeSpecName: "config") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.407718 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.416482 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z" (OuterVolumeSpecName: "kube-api-access-j7c4z") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "kube-api-access-j7c4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.409411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts" (OuterVolumeSpecName: "scripts") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.409444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.426016 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1178-account-create-nk49j"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.438369 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1178-account-create-nk49j"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.439061 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b444a5f8-8311-488c-b612-2d44328edc52/ovsdbserver-sb/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.439093 4763 generic.go:334] "Generic (PLEG): container finished" podID="b444a5f8-8311-488c-b612-2d44328edc52" containerID="a14cb1fee4bc452ad0d1ead1b1070a9175132789cf27b82cbc5f083baa390272" exitCode=2
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.439109 4763 generic.go:334] "Generic (PLEG): container finished" podID="b444a5f8-8311-488c-b612-2d44328edc52" containerID="5c5d1d55b187f82aaee42975506945a3d1714a7b414a61bf50e5ba10dc72666f" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.439142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerDied","Data":"a14cb1fee4bc452ad0d1ead1b1070a9175132789cf27b82cbc5f083baa390272"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.439161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerDied","Data":"5c5d1d55b187f82aaee42975506945a3d1714a7b414a61bf50e5ba10dc72666f"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.444192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq" (OuterVolumeSpecName: "kube-api-access-8xplq") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "kube-api-access-8xplq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.444275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.444906 4763 generic.go:334] "Generic (PLEG): container finished" podID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerID="e558c915bfc1f5ce99283b28f4010dce7a4750b02b11e663509a4bc0669fd4a9" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.444979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" event={"ID":"c65588a5-9e57-4d62-8abf-c0154251b6eb","Type":"ContainerDied","Data":"e558c915bfc1f5ce99283b28f4010dce7a4750b02b11e663509a4bc0669fd4a9"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.447530 4763 generic.go:334] "Generic (PLEG): container finished" podID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerID="aa4a777788599da63887c8bae5fa04a3d16d36136221168f67e902cc9b4ffdbc" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.447566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerDied","Data":"aa4a777788599da63887c8bae5fa04a3d16d36136221168f67e902cc9b4ffdbc"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.449198 4763 generic.go:334] "Generic (PLEG): container finished" podID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerID="260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.449254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerDied","Data":"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.456580 4763 generic.go:334] "Generic (PLEG): container finished" podID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerID="8f6a767bdad431305d84e26950c1846ebe85fc9d498b6fa5277ed34f9be13eb0" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.456779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerDied","Data":"8f6a767bdad431305d84e26950c1846ebe85fc9d498b6fa5277ed34f9be13eb0"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.458845 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.467520 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerID="3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.467585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs" event={"ID":"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7","Type":"ContainerDied","Data":"3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.483208 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="rabbitmq" containerID="cri-o://32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965" gracePeriod=604800
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.486341 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6cpk4_40580e5d-8c54-477e-af15-1ba2cf5d3dc0/openstack-network-exporter/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.486422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cpk4" event={"ID":"40580e5d-8c54-477e-af15-1ba2cf5d3dc0","Type":"ContainerDied","Data":"e0589c2b199d7a1eeef3ee81d2e36a477aad249ea3279f2222f2ef53b2fecd65"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.486459 4763 scope.go:117] "RemoveContainer" containerID="8095d8ba556be14995d51b888d8ed4a97695bfea0a9a26d8a90c3415861649c3"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.486572 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cpk4"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.493328 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500533 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500572 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500582 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovn-rundir\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500593 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xplq\" (UniqueName: \"kubernetes.io/projected/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-kube-api-access-8xplq\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500602 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7c4z\" (UniqueName: \"kubernetes.io/projected/1bcb31ee-4374-46aa-ab52-39f216f2bf67-kube-api-access-j7c4z\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500682 4763 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-ovs-rundir\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500696 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bcb31ee-4374-46aa-ab52-39f216f2bf67-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500704 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.500713 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.508352 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509258 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509328 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509340 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509350 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509356 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509370 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509441 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509448 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509455 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509348 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509528 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509484 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509605 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509642 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509651 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509658 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509739 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.509756 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.525928 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kz6ph"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.527034 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1bcb31ee-4374-46aa-ab52-39f216f2bf67/ovsdbserver-nb/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.527089 4763 generic.go:334] "Generic (PLEG): container finished" podID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.527158 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerDied","Data":"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.527186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1bcb31ee-4374-46aa-ab52-39f216f2bf67","Type":"ContainerDied","Data":"5d8e840ce558cd8c9947c823e5582e6d99fb4564e364740acfedd1b0f30b9743"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.527274 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.531469 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kz6ph"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.536314 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3e42-account-create-fwrfq"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.541278 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3e42-account-create-fwrfq"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.547051 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.547572 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-log" containerID="cri-o://4f656fbf184d6e7e7f0539b3b3bd12e503563cb2565c711ff5b955951ea221f9" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.548522 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-api" containerID="cri-o://adcaf8c8d3f281041745bafa4ac6ec897a52c954a8e11f85b8df351c7b2de708" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.548917 4763 generic.go:334] "Generic (PLEG): container finished" podID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerID="0ff5b5a9d760f2d9ecf1a7c7038e9965e60f8bcb7447f920abe0567b0a54badf" exitCode=0
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.548932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerDied","Data":"0ff5b5a9d760f2d9ecf1a7c7038e9965e60f8bcb7447f920abe0567b0a54badf"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.554885 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-km4xb"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.557120 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-km4xb"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.561040 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.562735 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ee6c-account-create-mk2rc"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.566710 4763 generic.go:334] "Generic (PLEG): container finished" podID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerID="33de3bfd1cb8b6e84aac16172cd6d4816d4bbcc99bd594f7601b9f6bae6bc708" exitCode=143
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.566774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerDied","Data":"33de3bfd1cb8b6e84aac16172cd6d4816d4bbcc99bd594f7601b9f6bae6bc708"}
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.593504 4763 generic.go:334] "Generic (PLEG): container finished" podID="f62b41ca-5e6d-4760-be62-af924b841737" containerID="ba6cd5ea4c4214f0c3ffef878583fe94e05f8dc0d7cd4b6711ee7dd59935b9a6" exitCode=137
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.594234 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ee6c-account-create-mk2rc"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.605934 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.605963 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.634228 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "40580e5d-8c54-477e-af15-1ba2cf5d3dc0" (UID: "40580e5d-8c54-477e-af15-1ba2cf5d3dc0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.641638 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.652605 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.677679 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hf5qv"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.688845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "1bcb31ee-4374-46aa-ab52-39f216f2bf67" (UID: "1bcb31ee-4374-46aa-ab52-39f216f2bf67"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.708758 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.708788 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.708800 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40580e5d-8c54-477e-af15-1ba2cf5d3dc0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.708809 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bcb31ee-4374-46aa-ab52-39f216f2bf67-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.711517 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hf5qv"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.723657 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.728466 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ca15-account-create-x7kw9"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.739676 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ca15-account-create-x7kw9"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.747676 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.747969 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" containerID="cri-o://c218a29eb7d3663d26274e3363b8a572e07d93eb706bdfc4ab5ab9ee292f214b" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.748718 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" containerID="cri-o://3371eb2323b8c35c65358b46ed5e998f521b7d07b05bcf5e5f18ac2589423079" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.769396 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.781675 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.781923 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener-log" containerID="cri-o://9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.782199 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener" containerID="cri-o://03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.794389 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerName="galera" containerID="cri-o://2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.799674 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.799860 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-697995ff8c-7vbhx" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker-log" containerID="cri-o://cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.800672 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-697995ff8c-7vbhx" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker" containerID="cri-o://63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.812291 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lw4hs"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.817353 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.817569 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d62b06765156a7ea72321c0cb90516a0c884d6a312dd38effde07268e0b70bd4" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.824591 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.824855 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85496b9568-j6pjn" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api-log" containerID="cri-o://570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.824959 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85496b9568-j6pjn" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api" containerID="cri-o://4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.852554 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="rabbitmq" containerID="cri-o://9dd278deff6663bb1f284e72573e37f876ea35fa390fa6b2c8c631abd30f4c75" gracePeriod=604800
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.863456 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.869052 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.869259 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerName="nova-scheduler-scheduler" containerID="cri-o://eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.893713 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.894056 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" gracePeriod=30
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.906380 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqmkx"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.909240 4763 scope.go:117] "RemoveContainer" containerID="2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911448 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911522 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911599 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911691 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjgkv\" (UniqueName: \"kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911789 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh7fk\" (UniqueName: \"kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk\") pod \"c65588a5-9e57-4d62-8abf-c0154251b6eb\" (UID: \"c65588a5-9e57-4d62-8abf-c0154251b6eb\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.911838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn\") pod \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\" (UID: \"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7\") "
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.912333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.912366 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run" (OuterVolumeSpecName: "var-run") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.913396 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b444a5f8-8311-488c-b612-2d44328edc52/ovsdbserver-sb/0.log"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.913508 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.916809 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.923367 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts" (OuterVolumeSpecName: "scripts") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.934544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv" (OuterVolumeSpecName: "kube-api-access-wjgkv") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "kube-api-access-wjgkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.934581 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qqmkx"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.954325 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-94w6q"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.960977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk" (OuterVolumeSpecName: "kube-api-access-kh7fk") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "kube-api-access-kh7fk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.974399 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-94w6q"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.974987 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.997402 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 06 15:15:20 crc kubenswrapper[4763]: I1006 15:15:20.997627 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" gracePeriod=30
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.010807 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"]
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013227 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013251 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013274 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013294 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv8dw\" (UniqueName: \"kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013384 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret\") pod \"f62b41ca-5e6d-4760-be62-af924b841737\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle\") pod \"f62b41ca-5e6d-4760-be62-af924b841737\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config\") pod \"f62b41ca-5e6d-4760-be62-af924b841737\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013533 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9mc9\" (UniqueName: \"kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9\") pod \"f62b41ca-5e6d-4760-be62-af924b841737\" (UID: \"f62b41ca-5e6d-4760-be62-af924b841737\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir\") pod \"b444a5f8-8311-488c-b612-2d44328edc52\" (UID: \"b444a5f8-8311-488c-b612-2d44328edc52\") "
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013810 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013822 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013831 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjgkv\" (UniqueName: \"kubernetes.io/projected/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-kube-api-access-wjgkv\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013841 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh7fk\" (UniqueName: \"kubernetes.io/projected/c65588a5-9e57-4d62-8abf-c0154251b6eb-kube-api-access-kh7fk\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013849 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.013857 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-var-log-ovn\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.014481 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.014932 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config" (OuterVolumeSpecName: "config") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.015020 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.017110 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.017217 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.018497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts" (OuterVolumeSpecName: "scripts") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.018637 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.018678 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.018794 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.029794 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.038974 4763 scope.go:117] "RemoveContainer" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.040334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw" (OuterVolumeSpecName: "kube-api-access-wv8dw") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "kube-api-access-wv8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.056856 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9" (OuterVolumeSpecName: "kube-api-access-c9mc9") pod "f62b41ca-5e6d-4760-be62-af924b841737" (UID: "f62b41ca-5e6d-4760-be62-af924b841737"). InnerVolumeSpecName "kube-api-access-c9mc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.071716 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.076002 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.091020 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.091649 4763 scope.go:117] "RemoveContainer" containerID="2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc" Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.093119 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc\": container with ID starting with 2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc not found: ID does not exist" containerID="2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.093149 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc"} err="failed to get container status \"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc\": rpc error: code = NotFound desc = could not find container \"2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc\": container with ID starting with 2e2194aa4e83ab3b5a89d813fbaf63a0b67d07f515c64afa033ce0af743000bc not found: ID does not exist" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.093171 4763 scope.go:117] "RemoveContainer" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.093714 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee\": container with ID starting with 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee not found: ID does not exist" containerID="37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.093738 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee"} err="failed to get container status \"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee\": rpc error: code = NotFound desc = could not find container \"37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee\": container with ID starting with 37d68ebfb9746b8f2e7cdc22a5f6f035e759dad6f15ffca494883f4a1bf21bee not found: ID does not exist" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.114030 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6cpk4"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119851 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9mc9\" (UniqueName: \"kubernetes.io/projected/f62b41ca-5e6d-4760-be62-af924b841737-kube-api-access-c9mc9\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119876 4763 reconciler_common.go:293] "Volume detached 
for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b444a5f8-8311-488c-b612-2d44328edc52-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119904 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119913 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119923 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv8dw\" (UniqueName: \"kubernetes.io/projected/b444a5f8-8311-488c-b612-2d44328edc52-kube-api-access-wv8dw\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.119932 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b444a5f8-8311-488c-b612-2d44328edc52-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.148730 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancebac8-account-delete-rw77w"] Oct 06 15:15:21 crc kubenswrapper[4763]: W1006 15:15:21.177404 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4 WatchSource:0}: Error finding container 9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4: Status 404 returned error can't find the container with id 9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.200480 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.200717 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ffc68c745-rgzs7" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-httpd" containerID="cri-o://d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef" gracePeriod=30 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.200874 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ffc68c745-rgzs7" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-server" containerID="cri-o://59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135" gracePeriod=30 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.225979 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"] Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.302108 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f62b41ca-5e6d-4760-be62-af924b841737" (UID: "f62b41ca-5e6d-4760-be62-af924b841737"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.304487 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.321744 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.322888 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.322907 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.322917 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.338978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.349926 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f62b41ca-5e6d-4760-be62-af924b841737" (UID: "f62b41ca-5e6d-4760-be62-af924b841737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.355569 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.382788 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.386199 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.424877 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.424907 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.424916 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.424925 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.424934 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.444982 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f62b41ca-5e6d-4760-be62-af924b841737" (UID: "f62b41ca-5e6d-4760-be62-af924b841737"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.455631 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" (UID: "2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.459045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config" (OuterVolumeSpecName: "config") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.468836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.478408 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b444a5f8-8311-488c-b612-2d44328edc52" (UID: "b444a5f8-8311-488c-b612-2d44328edc52"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.516754 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c65588a5-9e57-4d62-8abf-c0154251b6eb" (UID: "c65588a5-9e57-4d62-8abf-c0154251b6eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531797 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531828 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b444a5f8-8311-488c-b612-2d44328edc52-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531838 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531847 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531883 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65588a5-9e57-4d62-8abf-c0154251b6eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.531892 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f62b41ca-5e6d-4760-be62-af924b841737-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.589927 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f34f668-f8f5-4575-88b8-28af7c3c97c7" path="/var/lib/kubelet/pods/0f34f668-f8f5-4575-88b8-28af7c3c97c7/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.590410 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1561ea6e-3c44-48a1-a019-eb349a46c150" path="/var/lib/kubelet/pods/1561ea6e-3c44-48a1-a019-eb349a46c150/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.591003 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" path="/var/lib/kubelet/pods/1bcb31ee-4374-46aa-ab52-39f216f2bf67/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.591982 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25243974-29ff-4bcb-854c-a72b5901b0bf" 
path="/var/lib/kubelet/pods/25243974-29ff-4bcb-854c-a72b5901b0bf/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.592711 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257597c0-e6c4-40b8-9eaa-590d9caeb136" path="/var/lib/kubelet/pods/257597c0-e6c4-40b8-9eaa-590d9caeb136/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.593202 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325c5548-0def-436c-9c96-326c91d9b06e" path="/var/lib/kubelet/pods/325c5548-0def-436c-9c96-326c91d9b06e/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.593705 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" path="/var/lib/kubelet/pods/40580e5d-8c54-477e-af15-1ba2cf5d3dc0/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.594780 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4755a125-256f-47bf-8ed7-10c4662f10b4" path="/var/lib/kubelet/pods/4755a125-256f-47bf-8ed7-10c4662f10b4/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.595226 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f76244e-f9eb-428e-bd75-92eb5c6204a8" path="/var/lib/kubelet/pods/4f76244e-f9eb-428e-bd75-92eb5c6204a8/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.595730 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d2f5e0-56bc-4feb-8ebb-8347184ebe3a" path="/var/lib/kubelet/pods/51d2f5e0-56bc-4feb-8ebb-8347184ebe3a/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.596857 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b629946-f324-44b2-a40f-fcef64ee4766" path="/var/lib/kubelet/pods/8b629946-f324-44b2-a40f-fcef64ee4766/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.597345 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d34e81-6b03-42b5-a903-9f42e5618133" path="/var/lib/kubelet/pods/97d34e81-6b03-42b5-a903-9f42e5618133/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.597879 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af23f79f-d083-4a30-81fb-8f2791aa17eb" path="/var/lib/kubelet/pods/af23f79f-d083-4a30-81fb-8f2791aa17eb/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.598374 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1727578-835e-40a5-b9f2-84fbfe40a92f" path="/var/lib/kubelet/pods/c1727578-835e-40a5-b9f2-84fbfe40a92f/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.600252 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01055e0-1ab7-4f74-89b5-1227aa259394" path="/var/lib/kubelet/pods/d01055e0-1ab7-4f74-89b5-1227aa259394/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.600709 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d036ff61-b8cb-4cb5-a353-206c1306f39b" path="/var/lib/kubelet/pods/d036ff61-b8cb-4cb5-a353-206c1306f39b/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.601134 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34374ef-f967-4050-bca8-ec81d585ee6e" path="/var/lib/kubelet/pods/d34374ef-f967-4050-bca8-ec81d585ee6e/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.601597 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d61678-af2a-45c3-bf0e-3c5244a9390a" 
path="/var/lib/kubelet/pods/d9d61678-af2a-45c3-bf0e-3c5244a9390a/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.602981 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c9db67-1ae9-4eeb-8023-558bd1136960" path="/var/lib/kubelet/pods/e3c9db67-1ae9-4eeb-8023-558bd1136960/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.603422 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed100452-a1dc-4014-8f80-56a9ac00b198" path="/var/lib/kubelet/pods/ed100452-a1dc-4014-8f80-56a9ac00b198/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.603972 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62b41ca-5e6d-4760-be62-af924b841737" path="/var/lib/kubelet/pods/f62b41ca-5e6d-4760-be62-af924b841737/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.609304 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff78748e-fd0a-43e5-a9c5-35daa377da26" path="/var/lib/kubelet/pods/ff78748e-fd0a-43e5-a9c5-35daa377da26/volumes" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.626836 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerID="23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.627457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerDied","Data":"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.636924 4763 generic.go:334] "Generic (PLEG): container finished" podID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerID="cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52" exitCode=143 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.636986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerDied","Data":"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.641626 4763 generic.go:334] "Generic (PLEG): container finished" podID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerID="9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2" exitCode=143 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.641688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerDied","Data":"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.643493 4763 generic.go:334] "Generic (PLEG): container finished" podID="d14df013-8cb0-4f11-b69d-a52002788320" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.643550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerDied","Data":"0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.652039 4763 generic.go:334] "Generic (PLEG): container finished" podID="7909e384-b1c8-476c-801d-8b60015ccdc4" 
containerID="570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263" exitCode=143 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.652086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerDied","Data":"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.653116 4763 generic.go:334] "Generic (PLEG): container finished" podID="9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" containerID="50fdccbc02630ae2712de970e1336fbee2494416dfd9f27fc9dd20adae83b494" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.653157 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1178-account-delete-jjvjp" event={"ID":"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5","Type":"ContainerDied","Data":"50fdccbc02630ae2712de970e1336fbee2494416dfd9f27fc9dd20adae83b494"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.653173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1178-account-delete-jjvjp" event={"ID":"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5","Type":"ContainerStarted","Data":"fc42f0adb6fc8c3bbad4d9b014169d92a791aed0714dd86b9c2eb181e55a6304"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.669370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" event={"ID":"c65588a5-9e57-4d62-8abf-c0154251b6eb","Type":"ContainerDied","Data":"6dcf2c16f1b1c07a5bb4103238b712e3a60a85f87ce83eb7109b7b986905ffae"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.669413 4763 scope.go:117] "RemoveContainer" containerID="e558c915bfc1f5ce99283b28f4010dce7a4750b02b11e663509a4bc0669fd4a9" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.669516 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-p54rq" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.682573 4763 generic.go:334] "Generic (PLEG): container finished" podID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerID="4f656fbf184d6e7e7f0539b3b3bd12e503563cb2565c711ff5b955951ea221f9" exitCode=143 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.688859 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerDied","Data":"4f656fbf184d6e7e7f0539b3b3bd12e503563cb2565c711ff5b955951ea221f9"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.691887 4763 generic.go:334] "Generic (PLEG): container finished" podID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" containerID="d62b06765156a7ea72321c0cb90516a0c884d6a312dd38effde07268e0b70bd4" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.691981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"447e0e13-620c-40bb-b13c-5f9e7d5bba4a","Type":"ContainerDied","Data":"d62b06765156a7ea72321c0cb90516a0c884d6a312dd38effde07268e0b70bd4"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.693102 4763 generic.go:334] "Generic (PLEG): container finished" podID="aea53287-9722-47c6-a937-8a267b981e92" containerID="89e1d97f84ce3e350b74dfc3c0f4a2c2c39a40c3a849fcebb5d944bb8a94d3e6" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.693187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement19ad-account-delete-xlzs4" event={"ID":"aea53287-9722-47c6-a937-8a267b981e92","Type":"ContainerDied","Data":"89e1d97f84ce3e350b74dfc3c0f4a2c2c39a40c3a849fcebb5d944bb8a94d3e6"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.693214 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement19ad-account-delete-xlzs4" event={"ID":"aea53287-9722-47c6-a937-8a267b981e92","Type":"ContainerStarted","Data":"03c3a28074635734bfb19276301cb700620ce6b6d89eba4f2be71ec861678621"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.713804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lw4hs" event={"ID":"2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7","Type":"ContainerDied","Data":"6a8fba37f509e3de46058858ddef56d825fa8ec520581906fca22fc7c3f26c4c"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.714052 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lw4hs" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.716270 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa1a60e3-da79-4605-8b0b-329ac33c07a9" containerID="5cbe8e5bf4a3ca8937efa8f910df395238e5337627e20eabc0efab6c3054cef0" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.716338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron11f1-account-delete-gjtrg" event={"ID":"aa1a60e3-da79-4605-8b0b-329ac33c07a9","Type":"ContainerDied","Data":"5cbe8e5bf4a3ca8937efa8f910df395238e5337627e20eabc0efab6c3054cef0"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.716364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron11f1-account-delete-gjtrg" event={"ID":"aa1a60e3-da79-4605-8b0b-329ac33c07a9","Type":"ContainerStarted","Data":"3b77dc27ed8a5d1959bc452e56bf775499c3a78dbb1fc20e70781ffa8d3ee9ac"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.720534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancebac8-account-delete-rw77w" event={"ID":"5df38d69-816c-41c3-8de5-b270104ebb23","Type":"ContainerStarted","Data":"9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4"} Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.733669 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.733722 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data podName:2fad9bbe-33dc-4f1d-a156-52bbd3a69273 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:25.733708639 +0000 UTC m=+1322.889001151 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data") pod "rabbitmq-cell1-server-0" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273") : configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.733769 4763 generic.go:334] "Generic (PLEG): container finished" podID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerID="c218a29eb7d3663d26274e3363b8a572e07d93eb706bdfc4ab5ab9ee292f214b" exitCode=143 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.733956 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerDied","Data":"c218a29eb7d3663d26274e3363b8a572e07d93eb706bdfc4ab5ab9ee292f214b"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.742332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ca15-account-delete-fvfnr" event={"ID":"2235b0e6-860e-450c-b129-f0082e1670e1","Type":"ContainerStarted","Data":"a1922ef6722a84d4a6330874bda4dad5ff9dbf1b6f0adeac1a77237f804b80e4"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.746913 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b444a5f8-8311-488c-b612-2d44328edc52/ovsdbserver-sb/0.log" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.746993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b444a5f8-8311-488c-b612-2d44328edc52","Type":"ContainerDied","Data":"073c0c732f18861a2e2528365b60aece181bc2842d2501b76dd7f08e706ec232"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.747089 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.758032 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerID="d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef" exitCode=0 Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.758088 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerDied","Data":"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef"} Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.759498 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.942585 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.944745 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.946178 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 15:15:21 crc kubenswrapper[4763]: E1006 15:15:21.946214 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="ovn-northd" Oct 06 15:15:21 crc kubenswrapper[4763]: I1006 15:15:21.954835 4763 scope.go:117] "RemoveContainer" containerID="75069cc9ad83a59a931dc57aa44f9682f57ca621f24924238094ae0af4cdfe13" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.020668 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.034630 4763 scope.go:117] "RemoveContainer" containerID="3c4b01032b0edcf17554003538869b76d9315f50af3f11dcd288516a9dc13969" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.042945 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.071880 4763 scope.go:117] "RemoveContainer" containerID="a14cb1fee4bc452ad0d1ead1b1070a9175132789cf27b82cbc5f083baa390272" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.075018 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.083018 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-p54rq"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.093219 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.101896 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lw4hs"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.102011 4763 scope.go:117] "RemoveContainer" containerID="5c5d1d55b187f82aaee42975506945a3d1714a7b414a61bf50e5ba10dc72666f" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.127466 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lw4hs"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.164019 4763 scope.go:117] "RemoveContainer" containerID="ba6cd5ea4c4214f0c3ffef878583fe94e05f8dc0d7cd4b6711ee7dd59935b9a6" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.246723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data\") pod \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.247222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle\") pod \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.247252 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs\") pod \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.247308 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whrvl\" (UniqueName: \"kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl\") pod \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.247517 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs\") pod \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\" (UID: \"447e0e13-620c-40bb-b13c-5f9e7d5bba4a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.253942 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl" (OuterVolumeSpecName: "kube-api-access-whrvl") pod "447e0e13-620c-40bb-b13c-5f9e7d5bba4a" (UID: "447e0e13-620c-40bb-b13c-5f9e7d5bba4a"). InnerVolumeSpecName "kube-api-access-whrvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.346716 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "447e0e13-620c-40bb-b13c-5f9e7d5bba4a" (UID: "447e0e13-620c-40bb-b13c-5f9e7d5bba4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.352218 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.352247 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whrvl\" (UniqueName: \"kubernetes.io/projected/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-kube-api-access-whrvl\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.353517 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.357133 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "447e0e13-620c-40bb-b13c-5f9e7d5bba4a" (UID: "447e0e13-620c-40bb-b13c-5f9e7d5bba4a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.365150 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.367597 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data" (OuterVolumeSpecName: "config-data") pod "447e0e13-620c-40bb-b13c-5f9e7d5bba4a" (UID: "447e0e13-620c-40bb-b13c-5f9e7d5bba4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453218 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf8mh\" (UniqueName: \"kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh\") pod \"aa1a60e3-da79-4605-8b0b-329ac33c07a9\" (UID: \"aa1a60e3-da79-4605-8b0b-329ac33c07a9\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453387 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453416 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453456 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453488 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: \"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvjn\" (UniqueName: \"kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn\") pod \"42d3e722-26a6-40fa-9762-7da59b0009b7\" (UID: 
\"42d3e722-26a6-40fa-9762-7da59b0009b7\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453912 4763 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.453929 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: E1006 15:15:22.453985 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 15:15:22 crc kubenswrapper[4763]: E1006 15:15:22.454031 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data podName:5c83a4de-f6df-4d0e-9bd0-03cbcb877f43 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:26.454015748 +0000 UTC m=+1323.609308260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data") pod "rabbitmq-server-0" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43") : configmap "rabbitmq-config-data" not found Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.455023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.456149 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.456181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.456810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.458570 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets" (OuterVolumeSpecName: "secrets") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). 
InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.459705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn" (OuterVolumeSpecName: "kube-api-access-qzvjn") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "kube-api-access-qzvjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.460265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh" (OuterVolumeSpecName: "kube-api-access-zf8mh") pod "aa1a60e3-da79-4605-8b0b-329ac33c07a9" (UID: "aa1a60e3-da79-4605-8b0b-329ac33c07a9"). InnerVolumeSpecName "kube-api-access-zf8mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.461161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "447e0e13-620c-40bb-b13c-5f9e7d5bba4a" (UID: "447e0e13-620c-40bb-b13c-5f9e7d5bba4a"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.466458 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.483176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561143 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561178 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf8mh\" (UniqueName: \"kubernetes.io/projected/aa1a60e3-da79-4605-8b0b-329ac33c07a9-kube-api-access-zf8mh\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561209 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561220 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561231 4763 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561240 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561255 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzvjn\" (UniqueName: \"kubernetes.io/projected/42d3e722-26a6-40fa-9762-7da59b0009b7-kube-api-access-qzvjn\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561264 4763 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/447e0e13-620c-40bb-b13c-5f9e7d5bba4a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561274 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.561283 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/42d3e722-26a6-40fa-9762-7da59b0009b7-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.577799 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.590270 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.594450 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.609762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "42d3e722-26a6-40fa-9762-7da59b0009b7" (UID: "42d3e722-26a6-40fa-9762-7da59b0009b7"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.635746 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.662462 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.662489 4763 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d3e722-26a6-40fa-9762-7da59b0009b7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.729286 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765368 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765408 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5s67\" (UniqueName: \"kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67\") pod \"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5\" (UID: \"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle\") pod \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765457 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765485 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom\") pod \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765508 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: 
\"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data\") pod \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765581 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs\") pod \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765628 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmknc\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765647 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7hrr\" (UniqueName: \"kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr\") pod \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\" (UID: \"0be9cfb2-36bd-45a7-8d15-1603cb76780a\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765678 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765713 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6s2f\" (UniqueName: \"kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f\") pod \"aea53287-9722-47c6-a937-8a267b981e92\" (UID: \"aea53287-9722-47c6-a937-8a267b981e92\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.765728 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift\") pod \"0ff0676d-7674-42db-a71e-bba83d7e093e\" (UID: \"0ff0676d-7674-42db-a71e-bba83d7e093e\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.774896 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.775381 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs" (OuterVolumeSpecName: "logs") pod "0be9cfb2-36bd-45a7-8d15-1603cb76780a" (UID: "0be9cfb2-36bd-45a7-8d15-1603cb76780a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.776180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.780773 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc" (OuterVolumeSpecName: "kube-api-access-kmknc") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "kube-api-access-kmknc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.780856 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.797746 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement19ad-account-delete-xlzs4" event={"ID":"aea53287-9722-47c6-a937-8a267b981e92","Type":"ContainerDied","Data":"03c3a28074635734bfb19276301cb700620ce6b6d89eba4f2be71ec861678621"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.804451 4763 scope.go:117] "RemoveContainer" containerID="89e1d97f84ce3e350b74dfc3c0f4a2c2c39a40c3a849fcebb5d944bb8a94d3e6" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.803305 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.799538 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement19ad-account-delete-xlzs4" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.809858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67" (OuterVolumeSpecName: "kube-api-access-s5s67") pod "9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" (UID: "9fa9e5fe-caca-4b52-b66d-5869e1e67ab5"). InnerVolumeSpecName "kube-api-access-s5s67". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.810019 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr" (OuterVolumeSpecName: "kube-api-access-s7hrr") pod "0be9cfb2-36bd-45a7-8d15-1603cb76780a" (UID: "0be9cfb2-36bd-45a7-8d15-1603cb76780a"). 
InnerVolumeSpecName "kube-api-access-s7hrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.811292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0be9cfb2-36bd-45a7-8d15-1603cb76780a" (UID: "0be9cfb2-36bd-45a7-8d15-1603cb76780a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.811774 4763 generic.go:334] "Generic (PLEG): container finished" podID="5df38d69-816c-41c3-8de5-b270104ebb23" containerID="580fe997edd3a3f7db611460136856b9add2321662e07511e33a88c27f7fb783" exitCode=0 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.811914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancebac8-account-delete-rw77w" event={"ID":"5df38d69-816c-41c3-8de5-b270104ebb23","Type":"ContainerDied","Data":"580fe997edd3a3f7db611460136856b9add2321662e07511e33a88c27f7fb783"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.813405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f" (OuterVolumeSpecName: "kube-api-access-d6s2f") pod "aea53287-9722-47c6-a937-8a267b981e92" (UID: "aea53287-9722-47c6-a937-8a267b981e92"). InnerVolumeSpecName "kube-api-access-d6s2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.819077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"447e0e13-620c-40bb-b13c-5f9e7d5bba4a","Type":"ContainerDied","Data":"c83072254b881ffc43761eb1e090e9f3dc42f88169433338fbe36a374281ed7b"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.819152 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.849713 4763 scope.go:117] "RemoveContainer" containerID="d62b06765156a7ea72321c0cb90516a0c884d6a312dd38effde07268e0b70bd4" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.860945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.861579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron11f1-account-delete-gjtrg" event={"ID":"aa1a60e3-da79-4605-8b0b-329ac33c07a9","Type":"ContainerDied","Data":"3b77dc27ed8a5d1959bc452e56bf775499c3a78dbb1fc20e70781ffa8d3ee9ac"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.861848 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron11f1-account-delete-gjtrg" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.865158 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.868976 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869002 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869042 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9cfb2-36bd-45a7-8d15-1603cb76780a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869052 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmknc\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-kube-api-access-kmknc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869062 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7hrr\" (UniqueName: \"kubernetes.io/projected/0be9cfb2-36bd-45a7-8d15-1603cb76780a-kube-api-access-s7hrr\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869070 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869079 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ff0676d-7674-42db-a71e-bba83d7e093e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869115 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6s2f\" (UniqueName: \"kubernetes.io/projected/aea53287-9722-47c6-a937-8a267b981e92-kube-api-access-d6s2f\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869125 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ff0676d-7674-42db-a71e-bba83d7e093e-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869135 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869145 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5s67\" (UniqueName: \"kubernetes.io/projected/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5-kube-api-access-s5s67\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869594 4763 
generic.go:334] "Generic (PLEG): container finished" podID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerID="03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225" exitCode=0 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869672 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerDied","Data":"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" event={"ID":"df7d4540-2c3e-45ad-88b7-544734bb0413","Type":"ContainerDied","Data":"1925d66496eee672542fba5ca3de281b8471886dd5929a1a34819c22f33db164"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.869780 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5699bffc6b-r4hxp" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.879949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0be9cfb2-36bd-45a7-8d15-1603cb76780a" (UID: "0be9cfb2-36bd-45a7-8d15-1603cb76780a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.882564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data" (OuterVolumeSpecName: "config-data") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.889021 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerID="59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135" exitCode=0 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.890954 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ffc68c745-rgzs7" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.890979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerDied","Data":"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.891641 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffc68c745-rgzs7" event={"ID":"0ff0676d-7674-42db-a71e-bba83d7e093e","Type":"ContainerDied","Data":"169b16272c345969b09cd477d7f57f66ee6b66d73d67b1a449ca18b63b071d1e"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.900823 4763 generic.go:334] "Generic (PLEG): container finished" podID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerID="63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2" exitCode=0 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.900878 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-697995ff8c-7vbhx" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.900886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerDied","Data":"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.900911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-697995ff8c-7vbhx" event={"ID":"0be9cfb2-36bd-45a7-8d15-1603cb76780a","Type":"ContainerDied","Data":"44c382bd4e7106e7b4e9eb034e5eb7c1f99b5a39005e626cd536a3da60ded40b"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.912724 4763 scope.go:117] "RemoveContainer" containerID="5cbe8e5bf4a3ca8937efa8f910df395238e5337627e20eabc0efab6c3054cef0" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.914030 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ff0676d-7674-42db-a71e-bba83d7e093e" (UID: "0ff0676d-7674-42db-a71e-bba83d7e093e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.918968 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data" (OuterVolumeSpecName: "config-data") pod "0be9cfb2-36bd-45a7-8d15-1603cb76780a" (UID: "0be9cfb2-36bd-45a7-8d15-1603cb76780a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.920311 4763 generic.go:334] "Generic (PLEG): container finished" podID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerID="2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71" exitCode=0 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.920448 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerDied","Data":"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.920589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"42d3e722-26a6-40fa-9762-7da59b0009b7","Type":"ContainerDied","Data":"4bc4599d1c35e74df80d583142c8049f1b02bafc4828e8e2648a2835e96dd8ad"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.920462 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.927596 4763 generic.go:334] "Generic (PLEG): container finished" podID="2235b0e6-860e-450c-b129-f0082e1670e1" containerID="bc021661ad332a99756df1e9782c3d0c921c860c5720b780cd5afc97f614afeb" exitCode=1 Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.927748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ca15-account-delete-fvfnr" event={"ID":"2235b0e6-860e-450c-b129-f0082e1670e1","Type":"ContainerDied","Data":"bc021661ad332a99756df1e9782c3d0c921c860c5720b780cd5afc97f614afeb"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.935977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1178-account-delete-jjvjp" event={"ID":"9fa9e5fe-caca-4b52-b66d-5869e1e67ab5","Type":"ContainerDied","Data":"fc42f0adb6fc8c3bbad4d9b014169d92a791aed0714dd86b9c2eb181e55a6304"} Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.936071 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder1178-account-delete-jjvjp" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.940593 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.944847 4763 scope.go:117] "RemoveContainer" containerID="03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.969667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle\") pod \"df7d4540-2c3e-45ad-88b7-544734bb0413\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.969826 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw625\" (UniqueName: \"kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625\") pod \"df7d4540-2c3e-45ad-88b7-544734bb0413\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970078 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data\") pod \"df7d4540-2c3e-45ad-88b7-544734bb0413\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom\") pod \"df7d4540-2c3e-45ad-88b7-544734bb0413\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs\") pod \"df7d4540-2c3e-45ad-88b7-544734bb0413\" (UID: \"df7d4540-2c3e-45ad-88b7-544734bb0413\") " Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970696 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc 
kubenswrapper[4763]: I1006 15:15:22.970711 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970723 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0676d-7674-42db-a71e-bba83d7e093e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.970734 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9cfb2-36bd-45a7-8d15-1603cb76780a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.971443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs" (OuterVolumeSpecName: "logs") pod "df7d4540-2c3e-45ad-88b7-544734bb0413" (UID: "df7d4540-2c3e-45ad-88b7-544734bb0413"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.971479 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.982324 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df7d4540-2c3e-45ad-88b7-544734bb0413" (UID: "df7d4540-2c3e-45ad-88b7-544734bb0413"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.982502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625" (OuterVolumeSpecName: "kube-api-access-sw625") pod "df7d4540-2c3e-45ad-88b7-544734bb0413" (UID: "df7d4540-2c3e-45ad-88b7-544734bb0413"). InnerVolumeSpecName "kube-api-access-sw625". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:22 crc kubenswrapper[4763]: I1006 15:15:22.988492 4763 scope.go:117] "RemoveContainer" containerID="9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.001958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df7d4540-2c3e-45ad-88b7-544734bb0413" (UID: "df7d4540-2c3e-45ad-88b7-544734bb0413"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.006995 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.022141 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron11f1-account-delete-gjtrg"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.038410 4763 scope.go:117] "RemoveContainer" containerID="03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.038904 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225\": container with ID starting with 03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225 not found: ID does not exist" containerID="03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.038933 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225"} err="failed to get container status \"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225\": rpc error: code = NotFound desc = could not find container \"03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225\": container with ID starting with 03e2aa3a718dcb226cabc33c3c5517d0ae0424d6c93ab3cce643389801253225 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.038957 4763 scope.go:117] "RemoveContainer" containerID="9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.040193 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2\": container with ID starting with 9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2 not found: ID does not exist" containerID="9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.040232 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2"} err="failed to get container status \"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2\": rpc error: code = NotFound desc = could not find container \"9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2\": container with ID starting with 9997d2b4c0ca404844d8962b20f4c78e6c27c151aa84c69ad87d5c60d23957e2 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.040246 4763 scope.go:117] "RemoveContainer" containerID="59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.054997 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.070756 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.071906 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/df7d4540-2c3e-45ad-88b7-544734bb0413-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.071924 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.071933 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw625\" (UniqueName: \"kubernetes.io/projected/df7d4540-2c3e-45ad-88b7-544734bb0413-kube-api-access-sw625\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.071943 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.072574 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data" (OuterVolumeSpecName: "config-data") pod "df7d4540-2c3e-45ad-88b7-544734bb0413" (UID: "df7d4540-2c3e-45ad-88b7-544734bb0413"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.076684 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.081432 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder1178-account-delete-jjvjp"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.091813 4763 scope.go:117] "RemoveContainer" containerID="d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.111215 4763 scope.go:117] "RemoveContainer" containerID="59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.111605 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135\": container with ID starting with 59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135 not found: ID does not exist" containerID="59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.111654 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135"} err="failed to get container status \"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135\": rpc error: code = NotFound desc = could not find container \"59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135\": container with ID starting with 59db3ccefe999831fdf82b4c65ef73e69fcf15584d56c886ab5d16c4dc714135 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.111674 4763 scope.go:117] "RemoveContainer" containerID="d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.111861 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef\": 
container with ID starting with d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef not found: ID does not exist" containerID="d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.111886 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef"} err="failed to get container status \"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef\": rpc error: code = NotFound desc = could not find container \"d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef\": container with ID starting with d3626e95a73e5b6147b25547731a6ebd13fccfa89d767022bcc01d87902a9eef not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.111899 4763 scope.go:117] "RemoveContainer" containerID="63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.141929 4763 scope.go:117] "RemoveContainer" containerID="cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.142051 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.142572 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement19ad-account-delete-xlzs4"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.174165 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d4540-2c3e-45ad-88b7-544734bb0413-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.195429 4763 scope.go:117] "RemoveContainer" containerID="63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.205194 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2\": container with ID starting with 63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2 not found: ID does not exist" containerID="63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.205267 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2"} err="failed to get container status \"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2\": rpc error: code = NotFound desc = could not find container \"63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2\": container with ID starting with 63ed0707dae30a6c00602bd266dd8c1850dfc6da3523a78230554dbb488434b2 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.205294 4763 scope.go:117] "RemoveContainer" containerID="cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.205795 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52\": container with ID starting with cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52 not found: ID does not 
exist" containerID="cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.205826 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52"} err="failed to get container status \"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52\": rpc error: code = NotFound desc = could not find container \"cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52\": container with ID starting with cdc448cc526095c53261817e46f9b6c89d3150cd8f77a5ac275a4424f59cfc52 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.205850 4763 scope.go:117] "RemoveContainer" containerID="2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.220851 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.222124 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-central-agent" containerID="cri-o://08f39e51da65908a2dcee840d82e7c6b6fa5d8d45e73728b189ebdf1b5b89b0d" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.222514 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="proxy-httpd" containerID="cri-o://c09071984a4aec419c3daa2a55dbf3605ca256c2a5199a60850abcd7fd764271" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.222549 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-notification-agent" containerID="cri-o://c4dfdcfa2f39a9eade765b9d6b125e93d688c2a02a76e0cac1864c972b1b1c39" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.222692 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="sg-core" containerID="cri-o://b7afa41ebd51bd83378246bbfccfbc1dec4a81fdcfd2d71bd611a3acb01ef475" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.255940 4763 scope.go:117] "RemoveContainer" containerID="8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.258912 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.259107 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" containerName="kube-state-metrics" containerID="cri-o://b5a74a7863d471d5ebd8edc913311b4b8759f4a34d7919b70ef31a57831def5c" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.265071 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.281033 
4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.281693 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.293552 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5699bffc6b-r4hxp"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.320773 4763 scope.go:117] "RemoveContainer" containerID="2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.320869 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.320906 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerName="nova-cell1-conductor-conductor" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.333776 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71\": container with ID starting with 2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71 not found: ID does not exist" containerID="2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.333819 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71"} err="failed to get container status \"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71\": rpc error: code = NotFound desc = could not find container \"2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71\": container with ID starting with 2fed035834b65a3c2bce38561a5ae620da9f4cc81c27b964bcbf2f041570fd71 not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.333847 4763 scope.go:117] "RemoveContainer" containerID="8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.339061 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.339141 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e\": container with ID starting with 8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e not found: ID does not exist" containerID="8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.339163 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e"} err="failed to get container status \"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e\": rpc error: code = NotFound desc = could not find container \"8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e\": container with ID starting with 8f67fef0b441518c5d49b1bb239609d30a26c3ac4bfdde46dbbce780a945650e not found: ID does not exist" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.339185 4763 scope.go:117] "RemoveContainer" containerID="50fdccbc02630ae2712de970e1336fbee2494416dfd9f27fc9dd20adae83b494" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.370656 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.434645 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5ffc68c745-rgzs7"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.452694 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.452924 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="95acc4bd-d14c-4204-b20e-36085edffb73" containerName="memcached" containerID="cri-o://bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.485281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxzfn\" (UniqueName: \"kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn\") pod \"5df38d69-816c-41c3-8de5-b270104ebb23\" (UID: \"5df38d69-816c-41c3-8de5-b270104ebb23\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.489453 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.503638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn" (OuterVolumeSpecName: "kube-api-access-rxzfn") pod "5df38d69-816c-41c3-8de5-b270104ebb23" (UID: "5df38d69-816c-41c3-8de5-b270104ebb23"). InnerVolumeSpecName "kube-api-access-rxzfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.525512 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-697995ff8c-7vbhx"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.553916 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:42218->10.217.0.166:8776: read: connection reset by peer" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.593493 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" path="/var/lib/kubelet/pods/0be9cfb2-36bd-45a7-8d15-1603cb76780a/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.595061 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" path="/var/lib/kubelet/pods/0ff0676d-7674-42db-a71e-bba83d7e093e/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.596873 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" path="/var/lib/kubelet/pods/2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.601171 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" path="/var/lib/kubelet/pods/42d3e722-26a6-40fa-9762-7da59b0009b7/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.602847 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" path="/var/lib/kubelet/pods/447e0e13-620c-40bb-b13c-5f9e7d5bba4a/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.603292 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" path="/var/lib/kubelet/pods/9fa9e5fe-caca-4b52-b66d-5869e1e67ab5/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.604068 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxzfn\" (UniqueName: \"kubernetes.io/projected/5df38d69-816c-41c3-8de5-b270104ebb23-kube-api-access-rxzfn\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.605428 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1a60e3-da79-4605-8b0b-329ac33c07a9" path="/var/lib/kubelet/pods/aa1a60e3-da79-4605-8b0b-329ac33c07a9/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.605855 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea53287-9722-47c6-a937-8a267b981e92" path="/var/lib/kubelet/pods/aea53287-9722-47c6-a937-8a267b981e92/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.606406 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b444a5f8-8311-488c-b612-2d44328edc52" path="/var/lib/kubelet/pods/b444a5f8-8311-488c-b612-2d44328edc52/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.612543 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" path="/var/lib/kubelet/pods/c65588a5-9e57-4d62-8abf-c0154251b6eb/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613135 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" path="/var/lib/kubelet/pods/df7d4540-2c3e-45ad-88b7-544734bb0413/volumes" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613877 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xbqjk"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613900 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2hzxx"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613913 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xbqjk"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613927 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2hzxx"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.613939 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystonede27-account-delete-4rlbx"] Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614182 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614200 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614214 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614229 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="dnsmasq-dns" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614236 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="dnsmasq-dns" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614253 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614260 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614271 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df38d69-816c-41c3-8de5-b270104ebb23" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614277 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df38d69-816c-41c3-8de5-b270104ebb23" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614286 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1a60e3-da79-4605-8b0b-329ac33c07a9" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614292 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1a60e3-da79-4605-8b0b-329ac33c07a9" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614301 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" 
containerName="galera" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614307 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerName="galera" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614320 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker-log" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614326 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker-log" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614338 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614350 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-server" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614355 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-server" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614366 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="ovsdbserver-nb" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614372 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="ovsdbserver-nb" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614380 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="init" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614387 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="init" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614398 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614404 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614417 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614423 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614433 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener-log" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614439 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener-log" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614448 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614454 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614464 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-httpd" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614470 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-httpd" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614479 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614487 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614496 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="ovsdbserver-sb" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614501 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="ovsdbserver-sb" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614512 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea53287-9722-47c6-a937-8a267b981e92" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614518 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea53287-9722-47c6-a937-8a267b981e92" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.614527 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerName="mysql-bootstrap" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.614532 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerName="mysql-bootstrap" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615137 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener-log" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615155 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa9e5fe-caca-4b52-b66d-5869e1e67ab5" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615165 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65588a5-9e57-4d62-8abf-c0154251b6eb" containerName="dnsmasq-dns" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615178 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7d4540-2c3e-45ad-88b7-544734bb0413" containerName="barbican-keystone-listener" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615192 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615199 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcb31ee-4374-46aa-ab52-39f216f2bf67" 
containerName="ovsdbserver-nb" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615206 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1a60e3-da79-4605-8b0b-329ac33c07a9" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615218 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-server" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615229 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff0676d-7674-42db-a71e-bba83d7e093e" containerName="proxy-httpd" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615239 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615247 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker-log" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615255 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="40580e5d-8c54-477e-af15-1ba2cf5d3dc0" containerName="openstack-network-exporter" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615265 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d3e722-26a6-40fa-9762-7da59b0009b7" containerName="galera" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615271 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccb5f35-791b-45bb-9f93-d2e70a2dd4b7" containerName="ovn-controller" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615280 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be9cfb2-36bd-45a7-8d15-1603cb76780a" containerName="barbican-worker" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615287 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="447e0e13-620c-40bb-b13c-5f9e7d5bba4a" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615296 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df38d69-816c-41c3-8de5-b270104ebb23" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615306 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444a5f8-8311-488c-b612-2d44328edc52" containerName="ovsdbserver-sb" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615317 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea53287-9722-47c6-a937-8a267b981e92" containerName="mariadb-account-delete" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.615840 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystonede27-account-delete-4rlbx" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.620018 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystonede27-account-delete-4rlbx"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.624346 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.624541 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-59bd69c9bf-v6zw6" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerName="keystone-api" containerID="cri-o://4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.629540 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.705777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") pod \"keystonede27-account-delete-4rlbx\" (UID: \"824e459b-c900-47ae-ae65-63c4ffb58fcc\") " pod="openstack/keystonede27-account-delete-4rlbx" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.808451 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="galera" containerID="cri-o://906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080" gracePeriod=30 Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.808941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") pod \"keystonede27-account-delete-4rlbx\" (UID: \"824e459b-c900-47ae-ae65-63c4ffb58fcc\") " pod="openstack/keystonede27-account-delete-4rlbx" Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.816195 4763 projected.go:194] Error preparing data for projected volume kube-api-access-vsnrq for pod openstack/keystonede27-account-delete-4rlbx: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 06 15:15:23 crc kubenswrapper[4763]: E1006 15:15:23.816254 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq podName:824e459b-c900-47ae-ae65-63c4ffb58fcc nodeName:}" failed. No retries permitted until 2025-10-06 15:15:24.316238474 +0000 UTC m=+1321.471530986 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vsnrq" (UniqueName: "kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq") pod "keystonede27-account-delete-4rlbx" (UID: "824e459b-c900-47ae-ae65-63c4ffb58fcc") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.851486 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.867014 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910552 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dj9c\" (UniqueName: \"kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c\") pod \"2235b0e6-860e-450c-b129-f0082e1670e1\" (UID: \"2235b0e6-860e-450c-b129-f0082e1670e1\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910784 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d72n\" (UniqueName: \"kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.910973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.911090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs\") pod \"c465d0a4-ce55-49ff-bdd4-62585989b25b\" (UID: \"c465d0a4-ce55-49ff-bdd4-62585989b25b\") " Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.911228 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:55050->10.217.0.204:8775: read: connection reset by peer" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.911273 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.204:8775/\": read tcp 10.217.0.2:55034->10.217.0.204:8775: read: connection reset by peer" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.911689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs" (OuterVolumeSpecName: "logs") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.921628 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c" (OuterVolumeSpecName: "kube-api-access-5dj9c") pod "2235b0e6-860e-450c-b129-f0082e1670e1" (UID: "2235b0e6-860e-450c-b129-f0082e1670e1"). InnerVolumeSpecName "kube-api-access-5dj9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.922834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n" (OuterVolumeSpecName: "kube-api-access-2d72n") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "kube-api-access-2d72n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.936874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts" (OuterVolumeSpecName: "scripts") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:23 crc kubenswrapper[4763]: I1006 15:15:23.980806 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:23.993236 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data" (OuterVolumeSpecName: "config-data") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.000025 4763 generic.go:334] "Generic (PLEG): container finished" podID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" containerID="b5a74a7863d471d5ebd8edc913311b4b8759f4a34d7919b70ef31a57831def5c" exitCode=2 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.000081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0c33271c-af2f-43e4-adaf-9a81ef747ee5","Type":"ContainerDied","Data":"b5a74a7863d471d5ebd8edc913311b4b8759f4a34d7919b70ef31a57831def5c"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.002230 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancebac8-account-delete-rw77w" event={"ID":"5df38d69-816c-41c3-8de5-b270104ebb23","Type":"ContainerDied","Data":"9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.002258 4763 scope.go:117] "RemoveContainer" containerID="580fe997edd3a3f7db611460136856b9add2321662e07511e33a88c27f7fb783" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.002347 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancebac8-account-delete-rw77w" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012858 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012878 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012887 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012895 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dj9c\" (UniqueName: \"kubernetes.io/projected/2235b0e6-860e-450c-b129-f0082e1670e1-kube-api-access-5dj9c\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012904 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d72n\" (UniqueName: \"kubernetes.io/projected/c465d0a4-ce55-49ff-bdd4-62585989b25b-kube-api-access-2d72n\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.012912 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c465d0a4-ce55-49ff-bdd4-62585989b25b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.018346 4763 generic.go:334] "Generic (PLEG): container finished" podID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerID="d1eb8ece3ed05071e8f0e183f59d758fa3e0c6899e555329438a62699f5c4813" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.018418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerDied","Data":"d1eb8ece3ed05071e8f0e183f59d758fa3e0c6899e555329438a62699f5c4813"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.029190 4763 generic.go:334] 
"Generic (PLEG): container finished" podID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerID="08a1f33d966b3e95b4e43e2c4c9605f94fb70073e52950d5c8d680e81f2eeaa1" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.029265 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerDied","Data":"08a1f33d966b3e95b4e43e2c4c9605f94fb70073e52950d5c8d680e81f2eeaa1"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.065292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.065476 4763 generic.go:334] "Generic (PLEG): container finished" podID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerID="3371eb2323b8c35c65358b46ed5e998f521b7d07b05bcf5e5f18ac2589423079" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.065554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerDied","Data":"3371eb2323b8c35c65358b46ed5e998f521b7d07b05bcf5e5f18ac2589423079"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.086101 4763 generic.go:334] "Generic (PLEG): container finished" podID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerID="6e23f0e5ee7363ef074a84ef0fdf33f87ae71f9dfadd96d75eeac46f6868a70c" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.086191 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerDied","Data":"6e23f0e5ee7363ef074a84ef0fdf33f87ae71f9dfadd96d75eeac46f6868a70c"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.093030 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancebac8-account-delete-rw77w"] Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.099250 4763 generic.go:334] "Generic (PLEG): container finished" podID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerID="ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.099315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerDied","Data":"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.099496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79c97876dd-6hbjr" event={"ID":"c465d0a4-ce55-49ff-bdd4-62585989b25b","Type":"ContainerDied","Data":"9b2c25fc5c7409f12cebdeab232ccd2d28aaf39b2736a313ee43a5668ff91f62"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.099514 4763 scope.go:117] "RemoveContainer" containerID="ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.099631 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79c97876dd-6hbjr" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.100248 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancebac8-account-delete-rw77w"] Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.104256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ca15-account-delete-fvfnr" event={"ID":"2235b0e6-860e-450c-b129-f0082e1670e1","Type":"ContainerDied","Data":"a1922ef6722a84d4a6330874bda4dad5ff9dbf1b6f0adeac1a77237f804b80e4"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.104274 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ca15-account-delete-fvfnr" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108416 4763 generic.go:334] "Generic (PLEG): container finished" podID="78411411-8959-4af9-9396-864a5dc9f0b1" containerID="c09071984a4aec419c3daa2a55dbf3605ca256c2a5199a60850abcd7fd764271" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108448 4763 generic.go:334] "Generic (PLEG): container finished" podID="78411411-8959-4af9-9396-864a5dc9f0b1" containerID="b7afa41ebd51bd83378246bbfccfbc1dec4a81fdcfd2d71bd611a3acb01ef475" exitCode=2 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108460 4763 generic.go:334] "Generic (PLEG): container finished" podID="78411411-8959-4af9-9396-864a5dc9f0b1" containerID="08f39e51da65908a2dcee840d82e7c6b6fa5d8d45e73728b189ebdf1b5b89b0d" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerDied","Data":"c09071984a4aec419c3daa2a55dbf3605ca256c2a5199a60850abcd7fd764271"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108527 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerDied","Data":"b7afa41ebd51bd83378246bbfccfbc1dec4a81fdcfd2d71bd611a3acb01ef475"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.108540 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerDied","Data":"08f39e51da65908a2dcee840d82e7c6b6fa5d8d45e73728b189ebdf1b5b89b0d"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.109838 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85496b9568-j6pjn" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:35250->10.217.0.158:9311: read: connection reset by peer" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.109920 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85496b9568-j6pjn" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:35260->10.217.0.158:9311: read: connection reset by peer" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.114005 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.115505 4763 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c465d0a4-ce55-49ff-bdd4-62585989b25b" (UID: "c465d0a4-ce55-49ff-bdd4-62585989b25b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.137065 4763 generic.go:334] "Generic (PLEG): container finished" podID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerID="adcaf8c8d3f281041745bafa4ac6ec897a52c954a8e11f85b8df351c7b2de708" exitCode=0 Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.137164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerDied","Data":"adcaf8c8d3f281041745bafa4ac6ec897a52c954a8e11f85b8df351c7b2de708"} Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.150910 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.155700 4763 scope.go:117] "RemoveContainer" containerID="260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlkj2\" (UniqueName: \"kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215420 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215479 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215518 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215554 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.215724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs\") pod \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\" (UID: \"20b750fb-21cc-4a04-ba58-bddcbc2161e7\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.216110 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c465d0a4-ce55-49ff-bdd4-62585989b25b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.216513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs" (OuterVolumeSpecName: "logs") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.217101 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.217546 4763 scope.go:117] "RemoveContainer" containerID="ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1" Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.217604 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.221074 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.221216 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:24 crc 
kubenswrapper[4763]: E1006 15:15:24.221327 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1\": container with ID starting with ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1 not found: ID does not exist" containerID="ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.221355 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1"} err="failed to get container status \"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1\": rpc error: code = NotFound desc = could not find container \"ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1\": container with ID starting with ae7e45fd26b6481c205f6483d976e7fad5c355d4f9760a1c9053c504e5720de1 not found: ID does not exist" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.221378 4763 scope.go:117] "RemoveContainer" containerID="260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.223720 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts" (OuterVolumeSpecName: "scripts") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.223766 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.223876 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.223914 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.224034 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"] Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.224150 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.224711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.225201 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3\": container with ID starting with 260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3 not found: ID does not exist" containerID="260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.225224 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3"} err="failed to get container status \"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3\": rpc error: code = NotFound desc = could not find container \"260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3\": container with ID starting with 260bfab1ef8bc7850d49eb35069cf031dbcecc5e4928afffc6f8a504950f40b3 not found: ID does not exist" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.225245 4763 scope.go:117] "RemoveContainer" containerID="bc021661ad332a99756df1e9782c3d0c921c860c5720b780cd5afc97f614afeb" Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.226128 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.228148 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.237906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2" (OuterVolumeSpecName: "kube-api-access-vlkj2") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "kube-api-access-vlkj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.238960 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1ca15-account-delete-fvfnr"] Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.273989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data" (OuterVolumeSpecName: "config-data") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.276871 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.277424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.289977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.298222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "20b750fb-21cc-4a04-ba58-bddcbc2161e7" (UID: "20b750fb-21cc-4a04-ba58-bddcbc2161e7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rggx2\" (UniqueName: \"kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78f9k\" (UniqueName: \"kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316636 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316652 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316739 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316776 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316918 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs\") pod \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\" (UID: \"5bebf8d6-16bb-4dcf-afac-6a1a55e81350\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.316935 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317008 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs\") pod \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\" (UID: \"d08ec27f-a0b7-4146-8378-8bfb3e460e05\") " Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") pod \"keystonede27-account-delete-4rlbx\" (UID: \"824e459b-c900-47ae-ae65-63c4ffb58fcc\") " pod="openstack/keystonede27-account-delete-4rlbx" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317318 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317331 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317342 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20b750fb-21cc-4a04-ba58-bddcbc2161e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 
15:15:24.317352 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b750fb-21cc-4a04-ba58-bddcbc2161e7-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.317362 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.324783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.325559 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2" (OuterVolumeSpecName: "kube-api-access-rggx2") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "kube-api-access-rggx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.326384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs" (OuterVolumeSpecName: "logs") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.326413 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlkj2\" (UniqueName: \"kubernetes.io/projected/20b750fb-21cc-4a04-ba58-bddcbc2161e7-kube-api-access-vlkj2\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.326592 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.326679 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.326747 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b750fb-21cc-4a04-ba58-bddcbc2161e7-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.332653 4763 projected.go:194] Error preparing data for projected volume kube-api-access-vsnrq for pod openstack/keystonede27-account-delete-4rlbx: failed to fetch token: serviceaccounts "galera-openstack" not found
Oct 06 15:15:24 crc kubenswrapper[4763]: E1006 15:15:24.332731 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq podName:824e459b-c900-47ae-ae65-63c4ffb58fcc nodeName:}" failed. No retries permitted until 2025-10-06 15:15:25.33271304 +0000 UTC m=+1322.488005552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsnrq" (UniqueName: "kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq") pod "keystonede27-account-delete-4rlbx" (UID: "824e459b-c900-47ae-ae65-63c4ffb58fcc") : failed to fetch token: serviceaccounts "galera-openstack" not found
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.333406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.333666 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs" (OuterVolumeSpecName: "logs") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.351230 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k" (OuterVolumeSpecName: "kube-api-access-78f9k") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "kube-api-access-78f9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.352478 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts" (OuterVolumeSpecName: "scripts") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.365200 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.370527 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437101 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437183 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437194 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d08ec27f-a0b7-4146-8378-8bfb3e460e05-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437205 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437214 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437400 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437416 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rggx2\" (UniqueName: \"kubernetes.io/projected/d08ec27f-a0b7-4146-8378-8bfb3e460e05-kube-api-access-rggx2\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437425 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78f9k\" (UniqueName: \"kubernetes.io/projected/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-kube-api-access-78f9k\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.437434 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.466017 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data" (OuterVolumeSpecName: "config-data") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.466506 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data" (OuterVolumeSpecName: "config-data") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.468182 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.471153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5bebf8d6-16bb-4dcf-afac-6a1a55e81350" (UID: "5bebf8d6-16bb-4dcf-afac-6a1a55e81350"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.481134 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d08ec27f-a0b7-4146-8378-8bfb3e460e05" (UID: "d08ec27f-a0b7-4146-8378-8bfb3e460e05"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.494782 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552406 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552901 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552915 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552928 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552939 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec27f-a0b7-4146-8378-8bfb3e460e05-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.552950 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bebf8d6-16bb-4dcf-afac-6a1a55e81350-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.554577 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.806107 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.818004 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.844637 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-79c97876dd-6hbjr"]
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.846249 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.854600 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.858568 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsxtd\" (UniqueName: \"kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd\") pod \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.858730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle\") pod \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.858835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs\") pod \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.858903 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config\") pod \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\" (UID: \"0c33271c-af2f-43e4-adaf-9a81ef747ee5\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.861280 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85496b9568-j6pjn"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.866861 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-79c97876dd-6hbjr"]
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.866985 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd" (OuterVolumeSpecName: "kube-api-access-tsxtd") pod "0c33271c-af2f-43e4-adaf-9a81ef747ee5" (UID: "0c33271c-af2f-43e4-adaf-9a81ef747ee5"). InnerVolumeSpecName "kube-api-access-tsxtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.929808 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c33271c-af2f-43e4-adaf-9a81ef747ee5" (UID: "0c33271c-af2f-43e4-adaf-9a81ef747ee5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.935663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0c33271c-af2f-43e4-adaf-9a81ef747ee5" (UID: "0c33271c-af2f-43e4-adaf-9a81ef747ee5"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.959937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data\") pod \"18b427d1-75e9-4b32-afeb-f895661ddbe1\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.959972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle\") pod \"45540131-5bd7-47c8-bab3-da9362ab3aa3\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.959993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdwvc\" (UniqueName: \"kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc\") pod \"18b427d1-75e9-4b32-afeb-f895661ddbe1\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs\") pod \"18b427d1-75e9-4b32-afeb-f895661ddbe1\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960060 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qlpm\" (UniqueName: \"kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960138 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data\") pod \"45540131-5bd7-47c8-bab3-da9362ab3aa3\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960163 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn5tc\" (UniqueName: \"kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc\") pod \"45540131-5bd7-47c8-bab3-da9362ab3aa3\" (UID: \"45540131-5bd7-47c8-bab3-da9362ab3aa3\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs\") pod \"18b427d1-75e9-4b32-afeb-f895661ddbe1\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960299 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960361 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs\") pod \"7909e384-b1c8-476c-801d-8b60015ccdc4\" (UID: \"7909e384-b1c8-476c-801d-8b60015ccdc4\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle\") pod \"18b427d1-75e9-4b32-afeb-f895661ddbe1\" (UID: \"18b427d1-75e9-4b32-afeb-f895661ddbe1\") "
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960731 4763 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960745 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsxtd\" (UniqueName: \"kubernetes.io/projected/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-api-access-tsxtd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.960754 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.965392 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs" (OuterVolumeSpecName: "logs") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.965389 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs" (OuterVolumeSpecName: "logs") pod "18b427d1-75e9-4b32-afeb-f895661ddbe1" (UID: "18b427d1-75e9-4b32-afeb-f895661ddbe1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.965662 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc" (OuterVolumeSpecName: "kube-api-access-mdwvc") pod "18b427d1-75e9-4b32-afeb-f895661ddbe1" (UID: "18b427d1-75e9-4b32-afeb-f895661ddbe1"). InnerVolumeSpecName "kube-api-access-mdwvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.966275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0c33271c-af2f-43e4-adaf-9a81ef747ee5" (UID: "0c33271c-af2f-43e4-adaf-9a81ef747ee5"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.969762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.974570 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b7530761-b715-4178-8d58-5e1cd54838d0/ovn-northd/0.log"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.975583 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.976823 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc" (OuterVolumeSpecName: "kube-api-access-jn5tc") pod "45540131-5bd7-47c8-bab3-da9362ab3aa3" (UID: "45540131-5bd7-47c8-bab3-da9362ab3aa3"). InnerVolumeSpecName "kube-api-access-jn5tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:24 crc kubenswrapper[4763]: I1006 15:15:24.990847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm" (OuterVolumeSpecName: "kube-api-access-2qlpm") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "kube-api-access-2qlpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.002997 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data" (OuterVolumeSpecName: "config-data") pod "18b427d1-75e9-4b32-afeb-f895661ddbe1" (UID: "18b427d1-75e9-4b32-afeb-f895661ddbe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.010784 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.014708 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab is running failed: container process not found" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.019116 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab is running failed: container process not found" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.019480 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab is running failed: container process not found" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.019546 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerName="nova-scheduler-scheduler"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.042647 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.049743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18b427d1-75e9-4b32-afeb-f895661ddbe1" (UID: "18b427d1-75e9-4b32-afeb-f895661ddbe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063563 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063585 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krdsp\" (UniqueName: \"kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063680 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.063791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle\") pod \"b7530761-b715-4178-8d58-5e1cd54838d0\" (UID: \"b7530761-b715-4178-8d58-5e1cd54838d0\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064125 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064136 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064144 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdwvc\" (UniqueName: \"kubernetes.io/projected/18b427d1-75e9-4b32-afeb-f895661ddbe1-kube-api-access-mdwvc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064153 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b427d1-75e9-4b32-afeb-f895661ddbe1-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064162 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qlpm\" (UniqueName: \"kubernetes.io/projected/7909e384-b1c8-476c-801d-8b60015ccdc4-kube-api-access-2qlpm\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064170 4763 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c33271c-af2f-43e4-adaf-9a81ef747ee5-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064179 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn5tc\" (UniqueName: \"kubernetes.io/projected/45540131-5bd7-47c8-bab3-da9362ab3aa3-kube-api-access-jn5tc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064188 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064196 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7909e384-b1c8-476c-801d-8b60015ccdc4-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.064204 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.067456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config" (OuterVolumeSpecName: "config") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.068075 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts" (OuterVolumeSpecName: "scripts") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.068351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.071498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45540131-5bd7-47c8-bab3-da9362ab3aa3" (UID: "45540131-5bd7-47c8-bab3-da9362ab3aa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.079056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data" (OuterVolumeSpecName: "config-data") pod "45540131-5bd7-47c8-bab3-da9362ab3aa3" (UID: "45540131-5bd7-47c8-bab3-da9362ab3aa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.079301 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.080881 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp" (OuterVolumeSpecName: "kube-api-access-krdsp") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "kube-api-access-krdsp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.082442 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.097547 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.103199 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.131888 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.142521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data" (OuterVolumeSpecName: "config-data") pod "7909e384-b1c8-476c-801d-8b60015ccdc4" (UID: "7909e384-b1c8-476c-801d-8b60015ccdc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.149166 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "18b427d1-75e9-4b32-afeb-f895661ddbe1" (UID: "18b427d1-75e9-4b32-afeb-f895661ddbe1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165026 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165136 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle\") pod \"95acc4bd-d14c-4204-b20e-36085edffb73\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165175 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165192 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data\") pod \"5c67adc5-b329-4832-a9e6-711a70d0021e\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data\") pod \"95acc4bd-d14c-4204-b20e-36085edffb73\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165346 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5hd\" (UniqueName: \"kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd\") pod \"5c67adc5-b329-4832-a9e6-711a70d0021e\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165367 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs\") pod \"95acc4bd-d14c-4204-b20e-36085edffb73\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165409 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6t66\" (UniqueName: \"kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle\") pod \"5c67adc5-b329-4832-a9e6-711a70d0021e\" (UID: \"5c67adc5-b329-4832-a9e6-711a70d0021e\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165506 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4w6c\" (UniqueName: \"kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c\") pod \"95acc4bd-d14c-4204-b20e-36085edffb73\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.165528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.168757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config\") pod \"95acc4bd-d14c-4204-b20e-36085edffb73\" (UID: \"95acc4bd-d14c-4204-b20e-36085edffb73\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.168835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"14a424ce-ef7d-4b9c-965e-b821798d3f78\" (UID: \"14a424ce-ef7d-4b9c-965e-b821798d3f78\") "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.169704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts" (OuterVolumeSpecName: "scripts") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.170043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "95acc4bd-d14c-4204-b20e-36085edffb73" (UID: "95acc4bd-d14c-4204-b20e-36085edffb73"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.170436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs" (OuterVolumeSpecName: "logs") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.170685 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.170756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data" (OuterVolumeSpecName: "config-data") pod "95acc4bd-d14c-4204-b20e-36085edffb73" (UID: "95acc4bd-d14c-4204-b20e-36085edffb73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.173196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.173677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c" (OuterVolumeSpecName: "kube-api-access-h4w6c") pod "95acc4bd-d14c-4204-b20e-36085edffb73" (UID: "95acc4bd-d14c-4204-b20e-36085edffb73"). InnerVolumeSpecName "kube-api-access-h4w6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174037 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-rundir\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174070 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174083 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174091 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174099 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174107 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174116 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174125 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a424ce-ef7d-4b9c-965e-b821798d3f78-logs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174133 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174142 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45540131-5bd7-47c8-bab3-da9362ab3aa3-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174150 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4w6c\" (UniqueName: \"kubernetes.io/projected/95acc4bd-d14c-4204-b20e-36085edffb73-kube-api-access-h4w6c\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174159 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b427d1-75e9-4b32-afeb-f895661ddbe1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174167 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95acc4bd-d14c-4204-b20e-36085edffb73-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174175 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7530761-b715-4178-8d58-5e1cd54838d0-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174183 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krdsp\" (UniqueName: \"kubernetes.io/projected/b7530761-b715-4178-8d58-5e1cd54838d0-kube-api-access-krdsp\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174191 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174200 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.174209 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7909e384-b1c8-476c-801d-8b60015ccdc4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.176015 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd" (OuterVolumeSpecName: "kube-api-access-rt5hd") pod "5c67adc5-b329-4832-a9e6-711a70d0021e" (UID: "5c67adc5-b329-4832-a9e6-711a70d0021e"). InnerVolumeSpecName "kube-api-access-rt5hd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.176381 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.183157 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66" (OuterVolumeSpecName: "kube-api-access-k6t66") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "kube-api-access-k6t66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.186258 4763 generic.go:334] "Generic (PLEG): container finished" podID="95acc4bd-d14c-4204-b20e-36085edffb73" containerID="bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d" exitCode=0
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.186324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"95acc4bd-d14c-4204-b20e-36085edffb73","Type":"ContainerDied","Data":"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.186353 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"95acc4bd-d14c-4204-b20e-36085edffb73","Type":"ContainerDied","Data":"24f6f7e98ab943c7ffc938882db060f7a5c265b1fcdf183d0d6348f091468444"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.186370 4763 scope.go:117] "RemoveContainer" containerID="bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.186491 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.189700 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85496b9568-j6pjn"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.189762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerDied","Data":"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.189675 4763 generic.go:334] "Generic (PLEG): container finished" podID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerID="4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431" exitCode=0
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.190538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85496b9568-j6pjn" event={"ID":"7909e384-b1c8-476c-801d-8b60015ccdc4","Type":"ContainerDied","Data":"cf63db15a7b55bfdddd784dd5ee14bcf822b33d86707ad079b125332466032d7"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.194486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14a424ce-ef7d-4b9c-965e-b821798d3f78","Type":"ContainerDied","Data":"96389ebe98abf5578c37fd0a35183b2f424a733cb406605f7b723eaa55cb3668"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.194496 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.195473 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.197436 4763 generic.go:334] "Generic (PLEG): container finished" podID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" exitCode=0
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.197481 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45540131-5bd7-47c8-bab3-da9362ab3aa3","Type":"ContainerDied","Data":"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.197503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45540131-5bd7-47c8-bab3-da9362ab3aa3","Type":"ContainerDied","Data":"58f85118fbeb99b50c9ebc06e5f76b2d576ce5d3467a796f3f7d52a786c73ba1"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.197540 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.201206 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b7530761-b715-4178-8d58-5e1cd54838d0/ovn-northd/0.log"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.201246 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7530761-b715-4178-8d58-5e1cd54838d0" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" exitCode=139
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.201290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerDied","Data":"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.201315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b7530761-b715-4178-8d58-5e1cd54838d0","Type":"ContainerDied","Data":"05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.201366 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.203988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95acc4bd-d14c-4204-b20e-36085edffb73" (UID: "95acc4bd-d14c-4204-b20e-36085edffb73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.204520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d08ec27f-a0b7-4146-8378-8bfb3e460e05","Type":"ContainerDied","Data":"16fb5d9200273d3f09a4302bb3ab7baef0a10a31cae4facb34c852e6d5e82428"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.204640 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.214357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18b427d1-75e9-4b32-afeb-f895661ddbe1","Type":"ContainerDied","Data":"fe4e317a632f70a76770c9025334e8eabeb0a3254b6ca161eaa753f85e27f82c"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.214437 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.222831 4763 scope.go:117] "RemoveContainer" containerID="bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"
Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.225327 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d\": container with ID starting with bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d not found: ID does not exist" containerID="bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.225440 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d"} err="failed to get container status \"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d\": rpc error: code = NotFound desc = could not find container \"bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d\": container with ID starting with bd5a5d9e87e58c6c169d9bd84a5045b93ec0a00c66de759d12bda26d76ea1b7d not found: ID does not exist"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.225550 4763 scope.go:117] "RemoveContainer" containerID="4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.227939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0c33271c-af2f-43e4-adaf-9a81ef747ee5","Type":"ContainerDied","Data":"1a14369cae17ccba796f132e19e47253c65fd8c6b6b2c692c0e7018201c68c89"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.228120 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.228357 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data" (OuterVolumeSpecName: "config-data") pod "5c67adc5-b329-4832-a9e6-711a70d0021e" (UID: "5c67adc5-b329-4832-a9e6-711a70d0021e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.229103 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"]
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.231721 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" exitCode=0
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.231763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c67adc5-b329-4832-a9e6-711a70d0021e","Type":"ContainerDied","Data":"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.231783 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c67adc5-b329-4832-a9e6-711a70d0021e","Type":"ContainerDied","Data":"1d1058d2f7ca6c9aac6d91aa892c8622b8de5cddf8a0ed3a9e8c133fb306f6a0"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.231831 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.236861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5bebf8d6-16bb-4dcf-afac-6a1a55e81350","Type":"ContainerDied","Data":"84962c4492800f166ff497fa724946007caf829226e542df831dcf6fc6d272ae"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.237034 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.237759 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c67adc5-b329-4832-a9e6-711a70d0021e" (UID: "5c67adc5-b329-4832-a9e6-711a70d0021e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.239490 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-85496b9568-j6pjn"]
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.243554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20b750fb-21cc-4a04-ba58-bddcbc2161e7","Type":"ContainerDied","Data":"7cc312f3062db4c2e6a51b852482e092bc8da0844dc86203230af8ace5e49e52"}
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.243720 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.248175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.252776 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "95acc4bd-d14c-4204-b20e-36085edffb73" (UID: "95acc4bd-d14c-4204-b20e-36085edffb73"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275714 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275741 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275753 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275762 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5hd\" (UniqueName: \"kubernetes.io/projected/5c67adc5-b329-4832-a9e6-711a70d0021e-kube-api-access-rt5hd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275770 4763 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/95acc4bd-d14c-4204-b20e-36085edffb73-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275779 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275786 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6t66\" (UniqueName: \"kubernetes.io/projected/14a424ce-ef7d-4b9c-965e-b821798d3f78-kube-api-access-k6t66\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275795 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c67adc5-b329-4832-a9e6-711a70d0021e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275813 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.275707 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b7530761-b715-4178-8d58-5e1cd54838d0" (UID: "b7530761-b715-4178-8d58-5e1cd54838d0"). InnerVolumeSpecName "ovn-northd-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.276793 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.278366 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data" (OuterVolumeSpecName: "config-data") pod "14a424ce-ef7d-4b9c-965e-b821798d3f78" (UID: "14a424ce-ef7d-4b9c-965e-b821798d3f78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.279344 4763 scope.go:117] "RemoveContainer" containerID="570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.283159 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.290631 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.297325 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.303307 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.309947 4763 scope.go:117] "RemoveContainer" containerID="4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.310459 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431\": container with ID starting with 4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431 not found: ID does not exist" containerID="4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.310496 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431"} err="failed to get container status \"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431\": rpc error: code = NotFound desc = could not find container \"4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431\": container with ID starting with 4eb1b90a7861f3caf69ae05fce7cf6de659a6ad1eee8f44ba6e2ddec5c737431 not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.310521 4763 scope.go:117] "RemoveContainer" containerID="570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.311152 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263\": container with ID starting with 570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263 not found: ID does not exist" containerID="570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.311176 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263"} err="failed to get container status \"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263\": rpc error: code = NotFound desc = could not find container \"570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263\": container with ID starting with 570ca7d41313e7070b7607890245c36a10ae1f2f9679c08a2b89123ca0222263 not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.311190 4763 scope.go:117] "RemoveContainer" containerID="08a1f33d966b3e95b4e43e2c4c9605f94fb70073e52950d5c8d680e81f2eeaa1" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.311972 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.322138 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.326899 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.330940 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.339755 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.344144 4763 scope.go:117] "RemoveContainer" containerID="aa4a777788599da63887c8bae5fa04a3d16d36136221168f67e902cc9b4ffdbc" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.346472 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.350932 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.355130 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.371582 4763 scope.go:117] "RemoveContainer" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.377076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") pod \"keystonede27-account-delete-4rlbx\" (UID: \"824e459b-c900-47ae-ae65-63c4ffb58fcc\") " pod="openstack/keystonede27-account-delete-4rlbx" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.377207 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a424ce-ef7d-4b9c-965e-b821798d3f78-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.377224 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7530761-b715-4178-8d58-5e1cd54838d0-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.377233 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.381564 4763 projected.go:194] Error preparing data for projected volume kube-api-access-vsnrq for pod 
openstack/keystonede27-account-delete-4rlbx: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.381639 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq podName:824e459b-c900-47ae-ae65-63c4ffb58fcc nodeName:}" failed. No retries permitted until 2025-10-06 15:15:27.381596955 +0000 UTC m=+1324.536889467 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsnrq" (UniqueName: "kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq") pod "keystonede27-account-delete-4rlbx" (UID: "824e459b-c900-47ae-ae65-63c4ffb58fcc") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.488397 4763 scope.go:117] "RemoveContainer" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.507088 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d\": container with ID starting with c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d not found: ID does not exist" containerID="c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.507137 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d"} err="failed to get container status \"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d\": rpc error: code = NotFound desc = could not find container \"c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d\": container with ID starting with c56045314c38b3e79c03519fc780ac369bcb8612725aca748590cad86e170e9d not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.507170 4763 scope.go:117] "RemoveContainer" containerID="27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.534559 4763 scope.go:117] "RemoveContainer" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.574160 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.575964 4763 scope.go:117] "RemoveContainer" containerID="27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.580666 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950\": container with ID starting with 27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950 not found: ID does not exist" containerID="27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.580704 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950"} err="failed to get container status 
\"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950\": rpc error: code = NotFound desc = could not find container \"27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950\": container with ID starting with 27369b8c69016b0e4231da3be0b7b9db6595c4f8a78ab5cb0209ab7840565950 not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.580728 4763 scope.go:117] "RemoveContainer" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.581113 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8\": container with ID starting with 8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8 not found: ID does not exist" containerID="8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.581136 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8"} err="failed to get container status \"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8\": rpc error: code = NotFound desc = could not find container \"8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8\": container with ID starting with 8c2d7c9e3d9b7372b55330b1dc2e555543eb78050406d6801bb3fde257de83f8 not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.581152 4763 scope.go:117] "RemoveContainer" containerID="d1eb8ece3ed05071e8f0e183f59d758fa3e0c6899e555329438a62699f5c4813" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.597504 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" path="/var/lib/kubelet/pods/0c33271c-af2f-43e4-adaf-9a81ef747ee5/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.598204 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" path="/var/lib/kubelet/pods/18b427d1-75e9-4b32-afeb-f895661ddbe1/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.599172 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" path="/var/lib/kubelet/pods/20b750fb-21cc-4a04-ba58-bddcbc2161e7/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.600244 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2235b0e6-860e-450c-b129-f0082e1670e1" path="/var/lib/kubelet/pods/2235b0e6-860e-450c-b129-f0082e1670e1/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.600796 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" path="/var/lib/kubelet/pods/45540131-5bd7-47c8-bab3-da9362ab3aa3/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.605391 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" path="/var/lib/kubelet/pods/5bebf8d6-16bb-4dcf-afac-6a1a55e81350/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.606246 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df38d69-816c-41c3-8de5-b270104ebb23" path="/var/lib/kubelet/pods/5df38d69-816c-41c3-8de5-b270104ebb23/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.607025 
4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" path="/var/lib/kubelet/pods/7909e384-b1c8-476c-801d-8b60015ccdc4/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.608128 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1611a98-63a9-434a-b45a-164a527fe97e" path="/var/lib/kubelet/pods/a1611a98-63a9-434a-b45a-164a527fe97e/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.608728 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" path="/var/lib/kubelet/pods/c465d0a4-ce55-49ff-bdd4-62585989b25b/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.609439 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" path="/var/lib/kubelet/pods/d08ec27f-a0b7-4146-8378-8bfb3e460e05/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.610893 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2d7b1b-15e9-4aba-89ed-46b37e6a769c" path="/var/lib/kubelet/pods/fb2d7b1b-15e9-4aba-89ed-46b37e6a769c/volumes" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.616680 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.617108 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.621715 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.621861 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.621929 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.621984 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.622036 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.614080 4763 scope.go:117] "RemoveContainer" containerID="8f6a767bdad431305d84e26950c1846ebe85fc9d498b6fa5277ed34f9be13eb0" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.650594 4763 scope.go:117] "RemoveContainer" containerID="3371eb2323b8c35c65358b46ed5e998f521b7d07b05bcf5e5f18ac2589423079" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.684992 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a424ce_ef7d_4b9c_965e_b821798d3f78.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7530761_b715_4178_8d58_5e1cd54838d0.slice/crio-05a1f91ee9e5276b1fa040704a0637760883909173eddc9ab23aac7d8f9463b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95acc4bd_d14c_4204_b20e_36085edffb73.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c67adc5_b329_4832_a9e6_711a70d0021e.slice/crio-1d1058d2f7ca6c9aac6d91aa892c8622b8de5cddf8a0ed3a9e8c133fb306f6a0\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95acc4bd_d14c_4204_b20e_36085edffb73.slice/crio-24f6f7e98ab943c7ffc938882db060f7a5c265b1fcdf183d0d6348f091468444\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd7fbde_cddf_41fe_9a6e_6b1cdba389de.slice/crio-906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a424ce_ef7d_4b9c_965e_b821798d3f78.slice/crio-96389ebe98abf5578c37fd0a35183b2f424a733cb406605f7b723eaa55cb3668\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.711032 4763 scope.go:117] "RemoveContainer" containerID="c218a29eb7d3663d26274e3363b8a572e07d93eb706bdfc4ab5ab9ee292f214b" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.735880 4763 scope.go:117] "RemoveContainer" containerID="b5a74a7863d471d5ebd8edc913311b4b8759f4a34d7919b70ef31a57831def5c" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.774868 4763 scope.go:117] "RemoveContainer" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.787158 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.787211 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data podName:2fad9bbe-33dc-4f1d-a156-52bbd3a69273 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:33.787198415 +0000 UTC m=+1330.942490927 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data") pod "rabbitmq-cell1-server-0" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273") : configmap "rabbitmq-cell1-config-data" not found Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.830017 4763 scope.go:117] "RemoveContainer" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" Oct 06 15:15:25 crc kubenswrapper[4763]: E1006 15:15:25.830548 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab\": container with ID starting with eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab not found: ID does not exist" containerID="eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.830797 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab"} err="failed to get container status \"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab\": rpc error: code = NotFound desc = could not find container \"eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab\": container with ID starting with eb70c8f94c11e43ed875c02d848aa8c0c5616407e6e7e1f487ae3c8779d1bfab not found: ID does not exist" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.830821 4763 scope.go:117] "RemoveContainer" containerID="adcaf8c8d3f281041745bafa4ac6ec897a52c954a8e11f85b8df351c7b2de708" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.861310 4763 scope.go:117] "RemoveContainer" containerID="4f656fbf184d6e7e7f0539b3b3bd12e503563cb2565c711ff5b955951ea221f9" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.876091 4763 scope.go:117] "RemoveContainer" containerID="6e23f0e5ee7363ef074a84ef0fdf33f87ae71f9dfadd96d75eeac46f6868a70c" Oct 06 15:15:25 crc kubenswrapper[4763]: I1006 15:15:25.918822 4763 scope.go:117] "RemoveContainer" containerID="33de3bfd1cb8b6e84aac16172cd6d4816d4bbcc99bd594f7601b9f6bae6bc708" Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.007987 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.009456 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.021510 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.021583 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.106119 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194336 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7hfw\" (UniqueName: \"kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194643 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194813 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194853 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.194899 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets\") pod \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\" (UID: \"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de\") " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195161 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195223 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195543 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195581 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195600 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.195643 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.208962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets" (OuterVolumeSpecName: "secrets") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.208992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw" (OuterVolumeSpecName: "kube-api-access-d7hfw") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "kube-api-access-d7hfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.223928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.229219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.238007 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" (UID: "8fd7fbde-cddf-41fe-9a6e-6b1cdba389de"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.264736 4763 generic.go:334] "Generic (PLEG): container finished" podID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerID="906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080" exitCode=0 Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.264827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerDied","Data":"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080"} Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.264849 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.264879 4763 scope.go:117] "RemoveContainer" containerID="906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.264862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd7fbde-cddf-41fe-9a6e-6b1cdba389de","Type":"ContainerDied","Data":"4c1d21dd34b79f3b673ecb2ff7924594aa3a579f7110e79747f6c190a8b7f833"} Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.288143 4763 scope.go:117] "RemoveContainer" containerID="6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.297184 4763 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.297223 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.297236 4763 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.297245 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.297254 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7hfw\" (UniqueName: \"kubernetes.io/projected/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de-kube-api-access-d7hfw\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.300952 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.304493 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.312973 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.322485 4763 scope.go:117] "RemoveContainer" containerID="906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080" Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.323101 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080\": container with ID starting with 906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080 not found: ID does not exist" containerID="906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.323172 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080"} err="failed to get container status 
\"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080\": rpc error: code = NotFound desc = could not find container \"906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080\": container with ID starting with 906604be015c9bdf88ad317c63d36634214b1efcd79c7e04a1c004e8f4553080 not found: ID does not exist" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.323207 4763 scope.go:117] "RemoveContainer" containerID="6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719" Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.323659 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719\": container with ID starting with 6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719 not found: ID does not exist" containerID="6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.323706 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719"} err="failed to get container status \"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719\": rpc error: code = NotFound desc = could not find container \"6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719\": container with ID starting with 6a1c3214d0f7c9b550508abcaeabb091b29ae8a877a235592553251dba5f3719 not found: ID does not exist" Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.400019 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.502083 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 15:15:26 crc kubenswrapper[4763]: E1006 15:15:26.502171 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data podName:5c83a4de-f6df-4d0e-9bd0-03cbcb877f43 nodeName:}" failed. No retries permitted until 2025-10-06 15:15:34.502151078 +0000 UTC m=+1331.657443600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data") pod "rabbitmq-server-0" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43") : configmap "rabbitmq-config-data" not found Oct 06 15:15:26 crc kubenswrapper[4763]: I1006 15:15:26.863914 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-59bd69c9bf-v6zw6" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.150:5000/v3\": read tcp 10.217.0.2:56236->10.217.0.150:5000: read: connection reset by peer" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.155963 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229327 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229434 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229461 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229486 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjt25\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229538 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: 
\"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.229643 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd\") pod \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\" (UID: \"2fad9bbe-33dc-4f1d-a156-52bbd3a69273\") " Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.230353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.232732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.233162 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.235639 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.238564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25" (OuterVolumeSpecName: "kube-api-access-tjt25") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "kube-api-access-tjt25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.245860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info" (OuterVolumeSpecName: "pod-info") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.246951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.253349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.255292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data" (OuterVolumeSpecName: "config-data") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.270068 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.295818 4763 generic.go:334] "Generic (PLEG): container finished" podID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerID="4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207" exitCode=0 Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.295865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59bd69c9bf-v6zw6" event={"ID":"18ce3abd-750d-48db-a75f-e3a0d44e042d","Type":"ContainerDied","Data":"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207"} Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.295887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59bd69c9bf-v6zw6" event={"ID":"18ce3abd-750d-48db-a75f-e3a0d44e042d","Type":"ContainerDied","Data":"6ec565fbcae7b90e50e2cb126f8743cdaca85cbc711207b58eaf1fe128024b13"} Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.295904 4763 scope.go:117] "RemoveContainer" containerID="4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.295984 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59bd69c9bf-v6zw6" Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.300939 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerID="32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965" exitCode=0 Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.300976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerDied","Data":"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"} Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.300995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2fad9bbe-33dc-4f1d-a156-52bbd3a69273","Type":"ContainerDied","Data":"125782bda6f71dae1aae108fb18daaeda2d030e4707c1c6a8f51dee433b1b856"} Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.301043 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.305424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf" (OuterVolumeSpecName: "server-conf") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.308034 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerID="9dd278deff6663bb1f284e72573e37f876ea35fa390fa6b2c8c631abd30f4c75" exitCode=0
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.308060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerDied","Data":"9dd278deff6663bb1f284e72573e37f876ea35fa390fa6b2c8c631abd30f4c75"}
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.329550 4763 scope.go:117] "RemoveContainer" containerID="4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207"
Oct 06 15:15:27 crc kubenswrapper[4763]: E1006 15:15:27.331048 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207\": container with ID starting with 4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207 not found: ID does not exist" containerID="4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.331082 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207"} err="failed to get container status \"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207\": rpc error: code = NotFound desc = could not find container \"4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207\": container with ID starting with 4a3a5e086e25be91263086e5645f0b7770b87e749f6bb6d1a67546594167d207 not found: ID does not exist"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.331101 4763 scope.go:117] "RemoveContainer" containerID="32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.331815 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.331898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.331942 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332008 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332043 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qb6h\" (UniqueName: \"kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h\") pod \"18ce3abd-750d-48db-a75f-e3a0d44e042d\" (UID: \"18ce3abd-750d-48db-a75f-e3a0d44e042d\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332559 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332587 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332597 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332606 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332632 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332641 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332649 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjt25\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-kube-api-access-tjt25\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332657 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332665 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.332674 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.334524 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.336180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts" (OuterVolumeSpecName: "scripts") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.337193 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.343254 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h" (OuterVolumeSpecName: "kube-api-access-6qb6h") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "kube-api-access-6qb6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.346401 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2fad9bbe-33dc-4f1d-a156-52bbd3a69273" (UID: "2fad9bbe-33dc-4f1d-a156-52bbd3a69273"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.354449 4763 scope.go:117] "RemoveContainer" containerID="afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.367124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.374887 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.379724 4763 scope.go:117] "RemoveContainer" containerID="32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"
Oct 06 15:15:27 crc kubenswrapper[4763]: E1006 15:15:27.380353 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965\": container with ID starting with 32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965 not found: ID does not exist" containerID="32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.380391 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965"} err="failed to get container status \"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965\": rpc error: code = NotFound desc = could not find container \"32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965\": container with ID starting with 32d93e1a372b43ee7939bb33f6f959a151b9ef352537b03ef23571373eef7965 not found: ID does not exist"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.380411 4763 scope.go:117] "RemoveContainer" containerID="afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"
Oct 06 15:15:27 crc kubenswrapper[4763]: E1006 15:15:27.381452 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76\": container with ID starting with afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76 not found: ID does not exist" containerID="afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.381499 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76"} err="failed to get container status \"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76\": rpc error: code = NotFound desc = could not find container \"afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76\": container with ID starting with afc34e20166643cc79d6a16091588733629468e56648a79b4f5909a4ea23bd76 not found: ID does not exist"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.384735 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data" (OuterVolumeSpecName: "config-data") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.394762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.397799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "18ce3abd-750d-48db-a75f-e3a0d44e042d" (UID: "18ce3abd-750d-48db-a75f-e3a0d44e042d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") pod \"keystonede27-account-delete-4rlbx\" (UID: \"824e459b-c900-47ae-ae65-63c4ffb58fcc\") " pod="openstack/keystonede27-account-delete-4rlbx"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434458 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434472 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434481 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434489 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434500 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2fad9bbe-33dc-4f1d-a156-52bbd3a69273-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434508 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434516 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434526 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qb6h\" (UniqueName: \"kubernetes.io/projected/18ce3abd-750d-48db-a75f-e3a0d44e042d-kube-api-access-6qb6h\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434536 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.434544 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18ce3abd-750d-48db-a75f-e3a0d44e042d-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: E1006 15:15:27.437711 4763 projected.go:194] Error preparing data for projected volume kube-api-access-vsnrq for pod openstack/keystonede27-account-delete-4rlbx: failed to fetch token: serviceaccounts "galera-openstack" not found
Oct 06 15:15:27 crc kubenswrapper[4763]: E1006 15:15:27.437781 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq podName:824e459b-c900-47ae-ae65-63c4ffb58fcc nodeName:}" failed. No retries permitted until 2025-10-06 15:15:31.437763622 +0000 UTC m=+1328.593056134 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsnrq" (UniqueName: "kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq") pod "keystonede27-account-delete-4rlbx" (UID: "824e459b-c900-47ae-ae65-63c4ffb58fcc") : failed to fetch token: serviceaccounts "galera-openstack" not found
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.455971 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535533 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535563 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535813 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4ht\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht\") pod \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\" (UID: \"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43\") "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535904 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.535939 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.536374 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.536399 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.537196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.538627 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht" (OuterVolumeSpecName: "kube-api-access-2d4ht") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "kube-api-access-2d4ht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.539259 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.539999 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info" (OuterVolumeSpecName: "pod-info") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.540384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.541739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.554536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data" (OuterVolumeSpecName: "config-data") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.583673 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" path="/var/lib/kubelet/pods/14a424ce-ef7d-4b9c-965e-b821798d3f78/volumes"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.584333 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" path="/var/lib/kubelet/pods/5c67adc5-b329-4832-a9e6-711a70d0021e/volumes"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.584894 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" path="/var/lib/kubelet/pods/8fd7fbde-cddf-41fe-9a6e-6b1cdba389de/volumes"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.585872 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95acc4bd-d14c-4204-b20e-36085edffb73" path="/var/lib/kubelet/pods/95acc4bd-d14c-4204-b20e-36085edffb73/volumes"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.586409 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" path="/var/lib/kubelet/pods/b7530761-b715-4178-8d58-5e1cd54838d0/volumes"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.589146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf" (OuterVolumeSpecName: "server-conf") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637201 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4ht\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-kube-api-access-2d4ht\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637226 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637236 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637247 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637255 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637263 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637282 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.637291 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.641571 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" (UID: "5c83a4de-f6df-4d0e-9bd0-03cbcb877f43"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.644575 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.650919 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.654926 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.662733 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"]
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.667908 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-59bd69c9bf-v6zw6"]
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.738806 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:27 crc kubenswrapper[4763]: I1006 15:15:27.738834 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.326307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c83a4de-f6df-4d0e-9bd0-03cbcb877f43","Type":"ContainerDied","Data":"260709a27efd28981c554ca791871e2e0beeb8f26d1c602be4aa85acd764dd72"}
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.326349 4763 scope.go:117] "RemoveContainer" containerID="9dd278deff6663bb1f284e72573e37f876ea35fa390fa6b2c8c631abd30f4c75"
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.326471 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.392384 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.398472 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.415235 4763 scope.go:117] "RemoveContainer" containerID="e44c7c112f2e0d0aa4ddbe5721b5449d3464205575c5a4821512c86f3f926b10"
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.490271 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qlkzb"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.502424 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qlkzb"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.517887 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-de27-account-create-jl2kv"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.526901 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-de27-account-create-jl2kv"]
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.526958 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonede27-account-delete-4rlbx"]
Oct 06 15:15:28 crc kubenswrapper[4763]: E1006 15:15:28.527388 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vsnrq], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystonede27-account-delete-4rlbx" podUID="824e459b-c900-47ae-ae65-63c4ffb58fcc"
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.759549 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.855730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phb8l\" (UniqueName: \"kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l\") pod \"bcb3dd44-b7c9-4653-930a-113565fccec1\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") "
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.857013 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle\") pod \"bcb3dd44-b7c9-4653-930a-113565fccec1\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") "
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.857124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data\") pod \"bcb3dd44-b7c9-4653-930a-113565fccec1\" (UID: \"bcb3dd44-b7c9-4653-930a-113565fccec1\") "
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.862396 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l" (OuterVolumeSpecName: "kube-api-access-phb8l") pod "bcb3dd44-b7c9-4653-930a-113565fccec1" (UID: "bcb3dd44-b7c9-4653-930a-113565fccec1"). InnerVolumeSpecName "kube-api-access-phb8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.878066 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data" (OuterVolumeSpecName: "config-data") pod "bcb3dd44-b7c9-4653-930a-113565fccec1" (UID: "bcb3dd44-b7c9-4653-930a-113565fccec1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.892377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcb3dd44-b7c9-4653-930a-113565fccec1" (UID: "bcb3dd44-b7c9-4653-930a-113565fccec1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.959108 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phb8l\" (UniqueName: \"kubernetes.io/projected/bcb3dd44-b7c9-4653-930a-113565fccec1-kube-api-access-phb8l\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.959162 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:28 crc kubenswrapper[4763]: I1006 15:15:28.959180 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb3dd44-b7c9-4653-930a-113565fccec1-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.204354 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.204881 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.205536 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.205587 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server"
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.205653 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.207601 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.210025 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.210062 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.341464 4763 generic.go:334] "Generic (PLEG): container finished" podID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773" exitCode=0
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.341569 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcb3dd44-b7c9-4653-930a-113565fccec1","Type":"ContainerDied","Data":"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"}
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.341594 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.341653 4763 scope.go:117] "RemoveContainer" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.341607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcb3dd44-b7c9-4653-930a-113565fccec1","Type":"ContainerDied","Data":"d0fb7d133d0c3b61106d7b6aa1d97eb16a0c44b0a5e1c7aeee243b0c8b5ca131"}
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.349298 4763 generic.go:334] "Generic (PLEG): container finished" podID="78411411-8959-4af9-9396-864a5dc9f0b1" containerID="c4dfdcfa2f39a9eade765b9d6b125e93d688c2a02a76e0cac1864c972b1b1c39" exitCode=0
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.349362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonede27-account-delete-4rlbx"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.349600 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerDied","Data":"c4dfdcfa2f39a9eade765b9d6b125e93d688c2a02a76e0cac1864c972b1b1c39"}
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.366817 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonede27-account-delete-4rlbx"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.386070 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.393290 4763 scope.go:117] "RemoveContainer" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"
Oct 06 15:15:29 crc kubenswrapper[4763]: E1006 15:15:29.393902 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773\": container with ID starting with a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773 not found: ID does not exist" containerID="a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.393949 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773"} err="failed to get container status \"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773\": rpc error: code = NotFound desc = could not find container \"a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773\": container with ID starting with a1a427fb6df1e7a73fe730b7625a8fbbaf32bbe441f1f4fdfd0f1562cab1d773 not found: ID does not exist"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.395320 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.540776 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.587887 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8ce03a-8961-4c2d-858f-1d808f76c115" path="/var/lib/kubelet/pods/0a8ce03a-8961-4c2d-858f-1d808f76c115/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.589196 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" path="/var/lib/kubelet/pods/18ce3abd-750d-48db-a75f-e3a0d44e042d/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.590267 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="286e2cc3-89f5-4c6b-bf84-044f889676f8" path="/var/lib/kubelet/pods/286e2cc3-89f5-4c6b-bf84-044f889676f8/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.593514 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" path="/var/lib/kubelet/pods/2fad9bbe-33dc-4f1d-a156-52bbd3a69273/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.596113 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" path="/var/lib/kubelet/pods/5c83a4de-f6df-4d0e-9bd0-03cbcb877f43/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.598704 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" path="/var/lib/kubelet/pods/bcb3dd44-b7c9-4653-930a-113565fccec1/volumes"
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.670333 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.670420 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671173 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztb8l\" (UniqueName: \"kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671252 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671384 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671440 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd\") pod \"78411411-8959-4af9-9396-864a5dc9f0b1\" (UID: \"78411411-8959-4af9-9396-864a5dc9f0b1\") "
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.671668 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.672092 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.672309 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.675459 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts" (OuterVolumeSpecName: "scripts") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.680179 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l" (OuterVolumeSpecName: "kube-api-access-ztb8l") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "kube-api-access-ztb8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.700000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.719850 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.733660 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.770682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data" (OuterVolumeSpecName: "config-data") pod "78411411-8959-4af9-9396-864a5dc9f0b1" (UID: "78411411-8959-4af9-9396-864a5dc9f0b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773589 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773648 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773664 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztb8l\" (UniqueName: \"kubernetes.io/projected/78411411-8959-4af9-9396-864a5dc9f0b1-kube-api-access-ztb8l\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773681 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773693 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773706 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78411411-8959-4af9-9396-864a5dc9f0b1-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:29 crc kubenswrapper[4763]: I1006 15:15:29.773717 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78411411-8959-4af9-9396-864a5dc9f0b1-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.367695 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonede27-account-delete-4rlbx"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.368538 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.368805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78411411-8959-4af9-9396-864a5dc9f0b1","Type":"ContainerDied","Data":"f799d8b5debb8278650ae1752c19192f7332b2feea64dba5b2b7c380b0cbc8d7"}
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.368898 4763 scope.go:117] "RemoveContainer" containerID="c09071984a4aec419c3daa2a55dbf3605ca256c2a5199a60850abcd7fd764271"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.442340 4763 scope.go:117] "RemoveContainer" containerID="b7afa41ebd51bd83378246bbfccfbc1dec4a81fdcfd2d71bd611a3acb01ef475"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.463827 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.477811 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.481341 4763 scope.go:117] "RemoveContainer" containerID="c4dfdcfa2f39a9eade765b9d6b125e93d688c2a02a76e0cac1864c972b1b1c39"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.490581 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonede27-account-delete-4rlbx"]
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.495866 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystonede27-account-delete-4rlbx"]
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.504307 4763 scope.go:117] "RemoveContainer" containerID="08f39e51da65908a2dcee840d82e7c6b6fa5d8d45e73728b189ebdf1b5b89b0d"
Oct 06 15:15:30 crc kubenswrapper[4763]: I1006 15:15:30.589858 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsnrq\" (UniqueName: \"kubernetes.io/projected/824e459b-c900-47ae-ae65-63c4ffb58fcc-kube-api-access-vsnrq\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.382499 4763 generic.go:334] "Generic (PLEG): container finished" podID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerID="e605b0aaa421d3c856e90bbcb0d9a8126bc1d053474a70e3d68b5174771747d5" exitCode=0
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.382574 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerDied","Data":"e605b0aaa421d3c856e90bbcb0d9a8126bc1d053474a70e3d68b5174771747d5"}
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.560953 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b667cdf65-js4pw"
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.600381 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" path="/var/lib/kubelet/pods/78411411-8959-4af9-9396-864a5dc9f0b1/volumes"
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.601137 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824e459b-c900-47ae-ae65-63c4ffb58fcc" path="/var/lib/kubelet/pods/824e459b-c900-47ae-ae65-63c4ffb58fcc/volumes"
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707508 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707631 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4gqw\" (UniqueName: \"kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707826 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.707930 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs\") pod \"93a939be-54f9-4483-b37c-57e6d5b04f0d\" (UID: \"93a939be-54f9-4483-b37c-57e6d5b04f0d\") "
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.713685 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw" (OuterVolumeSpecName: "kube-api-access-n4gqw") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "kube-api-access-n4gqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.714024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.759326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.770097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.778703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.781705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config" (OuterVolumeSpecName: "config") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.786369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "93a939be-54f9-4483-b37c-57e6d5b04f0d" (UID: "93a939be-54f9-4483-b37c-57e6d5b04f0d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810253 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810302 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810318 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-config\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810328 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4gqw\" (UniqueName: \"kubernetes.io/projected/93a939be-54f9-4483-b37c-57e6d5b04f0d-kube-api-access-n4gqw\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810339 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810348 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:31 crc kubenswrapper[4763]: I1006 15:15:31.810355 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a939be-54f9-4483-b37c-57e6d5b04f0d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.396280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b667cdf65-js4pw" event={"ID":"93a939be-54f9-4483-b37c-57e6d5b04f0d","Type":"ContainerDied","Data":"6b9d2fcdd5150f0c45d912573a585533611f5c6dd30221d1e73989b2a7ac1ff4"}
Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.396361 4763 scope.go:117] "RemoveContainer" containerID="0ff5b5a9d760f2d9ecf1a7c7038e9965e60f8bcb7447f920abe0567b0a54badf"
Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.397465 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b667cdf65-js4pw" Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.426448 4763 scope.go:117] "RemoveContainer" containerID="e605b0aaa421d3c856e90bbcb0d9a8126bc1d053474a70e3d68b5174771747d5" Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.449828 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:15:32 crc kubenswrapper[4763]: I1006 15:15:32.457811 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b667cdf65-js4pw"] Oct 06 15:15:33 crc kubenswrapper[4763]: I1006 15:15:33.604286 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" path="/var/lib/kubelet/pods/93a939be-54f9-4483-b37c-57e6d5b04f0d/volumes" Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.203182 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.203781 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.204085 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.204117 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.204532 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.206593 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 
15:15:34.208087 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:34 crc kubenswrapper[4763]: E1006 15:15:34.208123 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:15:35 crc kubenswrapper[4763]: E1006 15:15:35.881688 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.203143 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.204296 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.204743 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.204821 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.205248 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.207116 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.209236 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:39 crc kubenswrapper[4763]: E1006 15:15:39.209298 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.203870 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.204916 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.205233 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.205261 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.205285 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.206731 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: 
, stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.208207 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:44 crc kubenswrapper[4763]: E1006 15:15:44.208239 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:15:46 crc kubenswrapper[4763]: E1006 15:15:46.081012 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.204005 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.204039 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd is running failed: container process not found" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.205262 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.205569 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd is running failed: container process not found" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.205999 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: 
container process not found" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.206038 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.206115 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd is running failed: container process not found" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 15:15:49 crc kubenswrapper[4763]: E1006 15:15:49.206205 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cf4dn" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.604190 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cf4dn_d14df013-8cb0-4f11-b69d-a52002788320/ovs-vswitchd/0.log" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.605055 4763 generic.go:334] "Generic (PLEG): container finished" podID="d14df013-8cb0-4f11-b69d-a52002788320" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" exitCode=137 Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.605125 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerDied","Data":"645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd"} Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.669468 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cf4dn_d14df013-8cb0-4f11-b69d-a52002788320/ovs-vswitchd/0.log" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.670055 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794547 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dznqd\" (UniqueName: \"kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run" (OuterVolumeSpecName: "var-run") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794780 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib" (OuterVolumeSpecName: "var-lib") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.794981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795054 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log" (OuterVolumeSpecName: "var-log") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795087 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs\") pod \"d14df013-8cb0-4f11-b69d-a52002788320\" (UID: \"d14df013-8cb0-4f11-b69d-a52002788320\") " Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795582 4763 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-log\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795631 4763 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795649 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.795665 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d14df013-8cb0-4f11-b69d-a52002788320-var-lib\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.796486 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts" (OuterVolumeSpecName: "scripts") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.805497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd" (OuterVolumeSpecName: "kube-api-access-dznqd") pod "d14df013-8cb0-4f11-b69d-a52002788320" (UID: "d14df013-8cb0-4f11-b69d-a52002788320"). InnerVolumeSpecName "kube-api-access-dznqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.897522 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d14df013-8cb0-4f11-b69d-a52002788320-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:49 crc kubenswrapper[4763]: I1006 15:15:49.897838 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dznqd\" (UniqueName: \"kubernetes.io/projected/d14df013-8cb0-4f11-b69d-a52002788320-kube-api-access-dznqd\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.297097 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.309853 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjn69\" (UniqueName: \"kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") pod \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403830 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock\") pod \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403851 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403903 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.403985 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h449c\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c\") pod \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.404017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache\") pod \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\" (UID: \"84d1d27d-b811-4100-9366-b71d6ae0f4a0\") " Oct 06 15:15:50 crc 
kubenswrapper[4763]: I1006 15:15:50.404067 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data\") pod \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\" (UID: \"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa\") " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.404276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.404848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock" (OuterVolumeSpecName: "lock") pod "84d1d27d-b811-4100-9366-b71d6ae0f4a0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.404972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache" (OuterVolumeSpecName: "cache") pod "84d1d27d-b811-4100-9366-b71d6ae0f4a0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.407468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c" (OuterVolumeSpecName: "kube-api-access-h449c") pod "84d1d27d-b811-4100-9366-b71d6ae0f4a0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0"). InnerVolumeSpecName "kube-api-access-h449c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.407867 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "84d1d27d-b811-4100-9366-b71d6ae0f4a0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.408214 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69" (OuterVolumeSpecName: "kube-api-access-bjn69") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "kube-api-access-bjn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.408702 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.408721 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts" (OuterVolumeSpecName: "scripts") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.409827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "84d1d27d-b811-4100-9366-b71d6ae0f4a0" (UID: "84d1d27d-b811-4100-9366-b71d6ae0f4a0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.440425 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.469958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data" (OuterVolumeSpecName: "config-data") pod "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" (UID: "1ed9fd34-3ac8-4420-958a-d4d41f7c83fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507592 4763 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-lock\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507670 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507689 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507734 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507749 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h449c\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-kube-api-access-h449c\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507763 4763 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/84d1d27d-b811-4100-9366-b71d6ae0f4a0-cache\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507775 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data\") on 
node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507787 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjn69\" (UniqueName: \"kubernetes.io/projected/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-kube-api-access-bjn69\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507798 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507810 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/84d1d27d-b811-4100-9366-b71d6ae0f4a0-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.507821 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.534013 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.610763 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.618863 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerID="8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428" exitCode=137 Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.618959 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.618937 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerDied","Data":"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428"} Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.619168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ed9fd34-3ac8-4420-958a-d4d41f7c83fa","Type":"ContainerDied","Data":"043ddb8ebba5c4c9199d8e7a668f4e3a49d03de335508579b67071918c00caa6"} Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.619202 4763 scope.go:117] "RemoveContainer" containerID="23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.622231 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cf4dn_d14df013-8cb0-4f11-b69d-a52002788320/ovs-vswitchd/0.log" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.624697 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cf4dn" event={"ID":"d14df013-8cb0-4f11-b69d-a52002788320","Type":"ContainerDied","Data":"474c8ae0c0ae7a3e36727f0aaec390fcdb9eadd6fb0a89930415b9183f155250"} Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.625057 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cf4dn" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.637080 4763 generic.go:334] "Generic (PLEG): container finished" podID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerID="45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1" exitCode=137 Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.637137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1"} Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.637178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"84d1d27d-b811-4100-9366-b71d6ae0f4a0","Type":"ContainerDied","Data":"e97644fc94ac473bf028f6ed61b44913b5e8b70072987fa1c96eac4af0a82575"} Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.637690 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.653124 4763 scope.go:117] "RemoveContainer" containerID="8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.671029 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.681406 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-cf4dn"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.690564 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.697513 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.709551 4763 scope.go:117] "RemoveContainer" containerID="23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50" Oct 06 15:15:50 crc kubenswrapper[4763]: E1006 15:15:50.710260 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50\": container with ID starting with 23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50 not found: ID does not exist" containerID="23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.710318 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50"} err="failed to get container status \"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50\": rpc error: code = NotFound desc = could not find container \"23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50\": container with ID starting with 23b123a5fc6ae0dc4d1fa65967cbe283e6a47f96eeedf0095af8d1f8290a1e50 not found: ID does not exist" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.710353 4763 scope.go:117] "RemoveContainer" containerID="8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428" Oct 06 15:15:50 crc kubenswrapper[4763]: E1006 15:15:50.710904 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428\": container with ID starting with 8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428 not found: ID does not exist" containerID="8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.710949 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428"} err="failed to get container status \"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428\": rpc error: code = NotFound desc = could not find container \"8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428\": container with ID starting with 8f447e69bdeb4d8e00ed3b4c50dc0478d5f7565c7252b0048bb077cb379e6428 not found: ID does not exist" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.711023 4763 scope.go:117] "RemoveContainer" containerID="645bc81dc8cd0837a29665df5b1756d628a00dda0a03da36538d91ff9e5cb8bd" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.714034 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.720239 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.742798 4763 scope.go:117] "RemoveContainer" containerID="0fbb142bb11e3a16bf1b6d8702f3fd45540174f6aab1e3d394cd598ad402ea86" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.779750 4763 scope.go:117] "RemoveContainer" containerID="1d3e6fcc20427b0cfe161638b80bcdb635dc69220c8ec3c37520d536e807fc49" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.808163 4763 scope.go:117] "RemoveContainer" containerID="45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.836680 4763 scope.go:117] "RemoveContainer" containerID="2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.863979 4763 scope.go:117] "RemoveContainer" containerID="4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.889559 4763 scope.go:117] "RemoveContainer" containerID="d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.934144 4763 scope.go:117] "RemoveContainer" containerID="cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.968180 4763 scope.go:117] "RemoveContainer" containerID="9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9" Oct 06 15:15:50 crc kubenswrapper[4763]: I1006 15:15:50.992803 4763 scope.go:117] "RemoveContainer" containerID="f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.014459 4763 scope.go:117] "RemoveContainer" containerID="ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.035580 4763 scope.go:117] "RemoveContainer" containerID="d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.066444 4763 scope.go:117] "RemoveContainer" containerID="bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 
15:15:51.098885 4763 scope.go:117] "RemoveContainer" containerID="4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.124144 4763 scope.go:117] "RemoveContainer" containerID="1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.149792 4763 scope.go:117] "RemoveContainer" containerID="f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.185227 4763 scope.go:117] "RemoveContainer" containerID="04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.213949 4763 scope.go:117] "RemoveContainer" containerID="f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.243350 4763 scope.go:117] "RemoveContainer" containerID="45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.244366 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1\": container with ID starting with 45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1 not found: ID does not exist" containerID="45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.244410 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1"} err="failed to get container status \"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1\": rpc error: code = NotFound desc = could not find container \"45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1\": container with ID starting with 45ddb6b64350fff7248248525f2e3bfaff54bf519b131fbf2f7aa42c596600c1 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.244438 4763 scope.go:117] "RemoveContainer" containerID="2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.244917 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f\": container with ID starting with 2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f not found: ID does not exist" containerID="2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.244989 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f"} err="failed to get container status \"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f\": rpc error: code = NotFound desc = could not find container \"2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f\": container with ID starting with 2874dac9f3306da3bcf1552015936f71e639feb7bf879854118c2cde7406f89f not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.245037 4763 scope.go:117] "RemoveContainer" containerID="4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 
15:15:51.245550 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c\": container with ID starting with 4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c not found: ID does not exist" containerID="4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.245703 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c"} err="failed to get container status \"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c\": rpc error: code = NotFound desc = could not find container \"4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c\": container with ID starting with 4c06c9484a9e090c21364bc7308c8c6fb460fd8800f077306cdd908c2fd1ae6c not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.245794 4763 scope.go:117] "RemoveContainer" containerID="d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.246709 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83\": container with ID starting with d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83 not found: ID does not exist" containerID="d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.246796 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83"} err="failed to get container status \"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83\": rpc error: code = NotFound desc = could not find container \"d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83\": container with ID starting with d33204ab0c936c7be88fe079af54cdc1fc1077968de5b4001634c0b691484d83 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.246834 4763 scope.go:117] "RemoveContainer" containerID="cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.247362 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191\": container with ID starting with cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191 not found: ID does not exist" containerID="cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.247503 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191"} err="failed to get container status \"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191\": rpc error: code = NotFound desc = could not find container \"cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191\": container with ID starting with cad402ebfbcec325f158eb2620602a57d0dc90961fafd23486616b7a06999191 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.247543 4763 
scope.go:117] "RemoveContainer" containerID="9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.248189 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9\": container with ID starting with 9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9 not found: ID does not exist" containerID="9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.248284 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9"} err="failed to get container status \"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9\": rpc error: code = NotFound desc = could not find container \"9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9\": container with ID starting with 9ad1d40411b029b255c270c727a443cad97ad3ae4f7d4257b4b24024af9c14d9 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.248315 4763 scope.go:117] "RemoveContainer" containerID="f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.248912 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038\": container with ID starting with f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038 not found: ID does not exist" containerID="f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.248976 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038"} err="failed to get container status \"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038\": rpc error: code = NotFound desc = could not find container \"f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038\": container with ID starting with f5751e652455cf776acd2027171295f48a53a249c8f295d17750ee4340cf6038 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.249021 4763 scope.go:117] "RemoveContainer" containerID="ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.249539 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a\": container with ID starting with ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a not found: ID does not exist" containerID="ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.249673 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a"} err="failed to get container status \"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a\": rpc error: code = NotFound desc = could not find container \"ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a\": container with ID starting with 
ec8398dc9940548d94dd6a48a4536a177bab65a692fbbf0655c55538352fc70a not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.249757 4763 scope.go:117] "RemoveContainer" containerID="d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.250149 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c\": container with ID starting with d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c not found: ID does not exist" containerID="d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.250195 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c"} err="failed to get container status \"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c\": rpc error: code = NotFound desc = could not find container \"d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c\": container with ID starting with d4e96824a1492a6686cce16b51527529cb04cbb80650c129e88cf68b027f925c not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.250222 4763 scope.go:117] "RemoveContainer" containerID="bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.250680 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d\": container with ID starting with bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d not found: ID does not exist" containerID="bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.250808 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d"} err="failed to get container status \"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d\": rpc error: code = NotFound desc = could not find container \"bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d\": container with ID starting with bc3fa9ad87fd31cc912ef4c42ba62ecac34fe53c1d8868df9fc25de8ab3dea9d not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.250891 4763 scope.go:117] "RemoveContainer" containerID="4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.251297 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26\": container with ID starting with 4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26 not found: ID does not exist" containerID="4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.251347 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26"} err="failed to get container status \"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26\": rpc 
error: code = NotFound desc = could not find container \"4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26\": container with ID starting with 4e3d8fe308f2917a8c25791f6f3597b915028c74c46c7cbb5077463d50158a26 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.251373 4763 scope.go:117] "RemoveContainer" containerID="1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.251894 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54\": container with ID starting with 1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54 not found: ID does not exist" containerID="1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.251996 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54"} err="failed to get container status \"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54\": rpc error: code = NotFound desc = could not find container \"1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54\": container with ID starting with 1c42ec0a7a2f0e6211b560ffe2b7ba01afac357302bfe8ac19667334feb87e54 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.252034 4763 scope.go:117] "RemoveContainer" containerID="f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.252481 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900\": container with ID starting with f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900 not found: ID does not exist" containerID="f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.252519 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900"} err="failed to get container status \"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900\": rpc error: code = NotFound desc = could not find container \"f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900\": container with ID starting with f6799ba7f12471aacbd6205a218391ada0169b5b91f4c16e8b0b4fdf91e64900 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.252539 4763 scope.go:117] "RemoveContainer" containerID="04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.252901 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315\": container with ID starting with 04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315 not found: ID does not exist" containerID="04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.252969 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315"} err="failed to get container status \"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315\": rpc error: code = NotFound desc = could not find container \"04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315\": container with ID starting with 04de32f64ab2ceca2a4d900a6be0f9df9fdeec997659e45a7a0c7aa24a585315 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.253015 4763 scope.go:117] "RemoveContainer" containerID="f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1" Oct 06 15:15:51 crc kubenswrapper[4763]: E1006 15:15:51.253388 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1\": container with ID starting with f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1 not found: ID does not exist" containerID="f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.253426 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1"} err="failed to get container status \"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1\": rpc error: code = NotFound desc = could not find container \"f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1\": container with ID starting with f88910d34802138d746a9eeb191199f184d82a73b14cc1b3bbb952ddf8d9abb1 not found: ID does not exist" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.592959 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" path="/var/lib/kubelet/pods/1ed9fd34-3ac8-4420-958a-d4d41f7c83fa/volumes" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.594124 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" path="/var/lib/kubelet/pods/84d1d27d-b811-4100-9366-b71d6ae0f4a0/volumes" Oct 06 15:15:51 crc kubenswrapper[4763]: I1006 15:15:51.598353 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14df013-8cb0-4f11-b69d-a52002788320" path="/var/lib/kubelet/pods/d14df013-8cb0-4f11-b69d-a52002788320/volumes" Oct 06 15:15:56 crc kubenswrapper[4763]: E1006 15:15:56.291652 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:16:06 crc kubenswrapper[4763]: E1006 15:16:06.545531 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:16:16 crc kubenswrapper[4763]: E1006 15:16:16.758266 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38d69_816c_41c3_8de5_b270104ebb23.slice/crio-9b5dd839637a37196e36cc96e0c71d2e565d336f0f45fed34be0829dca6deea4\": RecentStats: unable to find data in memory cache]" Oct 06 15:17:03 crc kubenswrapper[4763]: I1006 15:17:03.876981 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:17:03 crc kubenswrapper[4763]: I1006 15:17:03.877680 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.758653 4763 scope.go:117] "RemoveContainer" containerID="183e3372d23d8635d4405d3ea33596bff482fb5dc71ccbaf10e36cd781aa4b28" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.802120 4763 scope.go:117] "RemoveContainer" containerID="d961877ec4b14e2b7e6ef98679339178907bb759679beb038faba68e04b8d83c" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.851593 4763 scope.go:117] "RemoveContainer" containerID="7e445100d7b145aaaf73332a81543e81ff0be4b2cbeb11b7c92527fe9a85e535" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.896535 4763 scope.go:117] "RemoveContainer" containerID="f9d09efa24f76ffcf8c6cf177621756e703ca8bd1f580098a21fd80b3636e2ea" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.919281 4763 scope.go:117] "RemoveContainer" containerID="1e4d1491bc56ac192201b7ad02f9fb5fc81651e1d4b719423ea1e29ac06c157c" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.957214 4763 scope.go:117] "RemoveContainer" containerID="05bf3caedcaa68cbf017d60a64fb60ae8fb7231b8d15fab4b0d6608e04dd0284" Oct 06 15:17:25 crc kubenswrapper[4763]: I1006 15:17:25.978845 4763 scope.go:117] "RemoveContainer" containerID="fe138415dc947a471df08b86b03b53a815c137e140183219b500f293a4b6ab0b" Oct 06 15:17:26 crc kubenswrapper[4763]: I1006 15:17:26.003414 4763 scope.go:117] "RemoveContainer" containerID="e2dc15d06d499d02fb313b5e2fbbff8faaf5bdca3ae7c4532ae7f40c987017e1" Oct 06 15:17:26 crc kubenswrapper[4763]: I1006 15:17:26.030097 4763 scope.go:117] "RemoveContainer" containerID="059180dfffb2f4ec5062d7c004b4bc2124526f5d57c5fd6b90a56b685d97f34b" Oct 06 15:17:26 crc kubenswrapper[4763]: I1006 15:17:26.048303 4763 scope.go:117] "RemoveContainer" containerID="fd0ecba7a3ee2a477705590263aa7e4b257d48c0ca869f26bf98cf7458fbfbb0" Oct 06 15:17:26 crc kubenswrapper[4763]: I1006 15:17:26.082158 4763 scope.go:117] "RemoveContainer" containerID="39fd94c7915cbb68b0ee4017312fa1eba7294d2ea98ae1e2255811da30e8afb0" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.260813 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261703 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="mysql-bootstrap" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261761 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="mysql-bootstrap" Oct 06 15:17:27 crc 
kubenswrapper[4763]: E1006 15:17:27.261779 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261785 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261822 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="swift-recon-cron" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261836 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="swift-recon-cron" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261849 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="ovn-northd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261857 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="ovn-northd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261897 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261904 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-server" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261915 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-notification-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261923 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-notification-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261932 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="rsync" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261938 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="rsync" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261947 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="probe" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261976 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="probe" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.261985 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server-init" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.261992 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server-init" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262002 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262009 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api" Oct 06 15:17:27 crc 
kubenswrapper[4763]: E1006 15:17:27.262024 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-reaper" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262030 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-reaper" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262067 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262073 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-server" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262083 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262088 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262095 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262101 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262111 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="rabbitmq" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262151 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="rabbitmq" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262158 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262165 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262173 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262179 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262194 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerName="nova-scheduler-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262225 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerName="nova-scheduler-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262238 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262244 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api" Oct 06 15:17:27 crc 
kubenswrapper[4763]: E1006 15:17:27.262255 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262262 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-api" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262267 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262273 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262280 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262310 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262320 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262325 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-api" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="setup-container" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262339 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="setup-container" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262349 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" containerName="kube-state-metrics" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262354 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" containerName="kube-state-metrics" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262388 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95acc4bd-d14c-4204-b20e-36085edffb73" containerName="memcached" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262396 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="95acc4bd-d14c-4204-b20e-36085edffb73" containerName="memcached" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262402 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerName="keystone-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262408 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerName="keystone-api" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262415 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="rabbitmq" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262421 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="rabbitmq" Oct 06 15:17:27 crc 
kubenswrapper[4763]: E1006 15:17:27.262432 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262437 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262472 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262479 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262489 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262494 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262502 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262508 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262514 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262519 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262553 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2235b0e6-860e-450c-b129-f0082e1670e1" containerName="mariadb-account-delete" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262560 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2235b0e6-860e-450c-b129-f0082e1670e1" containerName="mariadb-account-delete" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262571 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262577 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262583 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262589 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262600 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-central-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262652 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-central-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262665 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="galera" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262672 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="galera" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262678 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262685 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262693 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="setup-container" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262699 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="setup-container" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262734 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="proxy-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262741 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="proxy-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262753 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262759 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262768 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262774 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262785 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262817 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262826 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262832 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262841 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262847 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262858 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262864 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerName="nova-cell1-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerName="nova-cell1-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262922 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-expirer" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262930 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-expirer" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262937 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-server" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262978 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="openstack-network-exporter" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.262985 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="openstack-network-exporter" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.262995 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263001 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.263009 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263015 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-api" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.263026 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="sg-core" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263056 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="sg-core" Oct 06 15:17:27 crc kubenswrapper[4763]: E1006 15:17:27.263066 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="cinder-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 
15:17:27.263073 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="cinder-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263413 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263459 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263470 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263478 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="sg-core" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263487 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263497 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263506 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c83a4de-f6df-4d0e-9bd0-03cbcb877f43" containerName="rabbitmq" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263536 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2235b0e6-860e-450c-b129-f0082e1670e1" containerName="mariadb-account-delete" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263549 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263587 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263634 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c67adc5-b329-4832-a9e6-711a70d0021e" containerName="nova-scheduler-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263644 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263651 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="probe" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263659 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="95acc4bd-d14c-4204-b20e-36085edffb73" containerName="memcached" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263671 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263703 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263715 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" 
containerName="object-expirer" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263721 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c465d0a4-ce55-49ff-bdd4-62585989b25b" containerName="placement-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263730 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263743 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="ovn-northd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263753 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263793 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b750fb-21cc-4a04-ba58-bddcbc2161e7" containerName="cinder-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263804 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-central-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263814 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7909e384-b1c8-476c-801d-8b60015ccdc4" containerName="barbican-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263826 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263834 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b427d1-75e9-4b32-afeb-f895661ddbe1" containerName="nova-metadata-metadata" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263879 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a939be-54f9-4483-b37c-57e6d5b04f0d" containerName="neutron-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263891 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="proxy-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263901 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263911 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovsdb-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263949 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c33271c-af2f-43e4-adaf-9a81ef747ee5" containerName="kube-state-metrics" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263959 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14df013-8cb0-4f11-b69d-a52002788320" containerName="ovs-vswitchd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263966 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-server" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263979 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-reaper" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263991 4763 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2fad9bbe-33dc-4f1d-a156-52bbd3a69273" containerName="rabbitmq" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.263999 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed9fd34-3ac8-4420-958a-d4d41f7c83fa" containerName="cinder-scheduler" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264043 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="account-replicator" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264056 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264070 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7530761-b715-4178-8d58-5e1cd54838d0" containerName="openstack-network-exporter" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264078 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb3dd44-b7c9-4653-930a-113565fccec1" containerName="nova-cell0-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264118 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="swift-recon-cron" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264131 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd7fbde-cddf-41fe-9a6e-6b1cdba389de" containerName="galera" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264138 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08ec27f-a0b7-4146-8378-8bfb3e460e05" containerName="glance-httpd" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264144 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="78411411-8959-4af9-9396-864a5dc9f0b1" containerName="ceilometer-notification-agent" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264154 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="container-auditor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264162 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ce3abd-750d-48db-a75f-e3a0d44e042d" containerName="keystone-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264168 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="object-updater" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264203 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bebf8d6-16bb-4dcf-afac-6a1a55e81350" containerName="nova-api-api" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264213 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d27d-b811-4100-9366-b71d6ae0f4a0" containerName="rsync" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264220 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a424ce-ef7d-4b9c-965e-b821798d3f78" containerName="glance-log" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.264230 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="45540131-5bd7-47c8-bab3-da9362ab3aa3" containerName="nova-cell1-conductor-conductor" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.265864 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.288490 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.336963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6nf\" (UniqueName: \"kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.337051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.337142 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.439560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6nf\" (UniqueName: \"kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.439713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.439849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.440350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.440525 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.462086 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tw6nf\" (UniqueName: \"kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf\") pod \"community-operators-wv56k\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:27 crc kubenswrapper[4763]: I1006 15:17:27.611983 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:28 crc kubenswrapper[4763]: I1006 15:17:28.091042 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:28 crc kubenswrapper[4763]: I1006 15:17:28.579488 4763 generic.go:334] "Generic (PLEG): container finished" podID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerID="287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d" exitCode=0 Oct 06 15:17:28 crc kubenswrapper[4763]: I1006 15:17:28.579566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerDied","Data":"287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d"} Oct 06 15:17:28 crc kubenswrapper[4763]: I1006 15:17:28.579900 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerStarted","Data":"4bd0fcedf3073dde4977a5ccf212855b23762357daa72d70639878a2781117a5"} Oct 06 15:17:28 crc kubenswrapper[4763]: I1006 15:17:28.581109 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:17:29 crc kubenswrapper[4763]: I1006 15:17:29.591824 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerStarted","Data":"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2"} Oct 06 15:17:30 crc kubenswrapper[4763]: I1006 15:17:30.607960 4763 generic.go:334] "Generic (PLEG): container finished" podID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerID="48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2" exitCode=0 Oct 06 15:17:30 crc kubenswrapper[4763]: I1006 15:17:30.608076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerDied","Data":"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2"} Oct 06 15:17:31 crc kubenswrapper[4763]: I1006 15:17:31.617506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerStarted","Data":"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351"} Oct 06 15:17:31 crc kubenswrapper[4763]: I1006 15:17:31.637558 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wv56k" podStartSLOduration=2.208493701 podStartE2EDuration="4.637543105s" podCreationTimestamp="2025-10-06 15:17:27 +0000 UTC" firstStartedPulling="2025-10-06 15:17:28.58091627 +0000 UTC m=+1445.736208782" lastFinishedPulling="2025-10-06 15:17:31.009965674 +0000 UTC m=+1448.165258186" observedRunningTime="2025-10-06 15:17:31.633712227 +0000 UTC m=+1448.789004749" watchObservedRunningTime="2025-10-06 
15:17:31.637543105 +0000 UTC m=+1448.792835617" Oct 06 15:17:33 crc kubenswrapper[4763]: I1006 15:17:33.877053 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:17:33 crc kubenswrapper[4763]: I1006 15:17:33.877412 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:17:37 crc kubenswrapper[4763]: I1006 15:17:37.613155 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:37 crc kubenswrapper[4763]: I1006 15:17:37.613672 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:37 crc kubenswrapper[4763]: I1006 15:17:37.693342 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:38 crc kubenswrapper[4763]: I1006 15:17:38.748374 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:38 crc kubenswrapper[4763]: I1006 15:17:38.822608 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:40 crc kubenswrapper[4763]: I1006 15:17:40.701706 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wv56k" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="registry-server" containerID="cri-o://3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351" gracePeriod=2 Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.097834 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.174523 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content\") pod \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.174673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw6nf\" (UniqueName: \"kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf\") pod \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.174828 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities\") pod \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\" (UID: \"4bc9148e-1b42-4c70-abe3-626fcc55b0e8\") " Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.177303 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities" (OuterVolumeSpecName: "utilities") pod "4bc9148e-1b42-4c70-abe3-626fcc55b0e8" (UID: "4bc9148e-1b42-4c70-abe3-626fcc55b0e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.181123 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf" (OuterVolumeSpecName: "kube-api-access-tw6nf") pod "4bc9148e-1b42-4c70-abe3-626fcc55b0e8" (UID: "4bc9148e-1b42-4c70-abe3-626fcc55b0e8"). InnerVolumeSpecName "kube-api-access-tw6nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.243804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bc9148e-1b42-4c70-abe3-626fcc55b0e8" (UID: "4bc9148e-1b42-4c70-abe3-626fcc55b0e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.277320 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.277378 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.277393 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw6nf\" (UniqueName: \"kubernetes.io/projected/4bc9148e-1b42-4c70-abe3-626fcc55b0e8-kube-api-access-tw6nf\") on node \"crc\" DevicePath \"\"" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.715303 4763 generic.go:334] "Generic (PLEG): container finished" podID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerID="3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351" exitCode=0 Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.715364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerDied","Data":"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351"} Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.715408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv56k" event={"ID":"4bc9148e-1b42-4c70-abe3-626fcc55b0e8","Type":"ContainerDied","Data":"4bd0fcedf3073dde4977a5ccf212855b23762357daa72d70639878a2781117a5"} Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.715431 4763 scope.go:117] "RemoveContainer" containerID="3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.715430 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wv56k" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.745701 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.763195 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wv56k"] Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.772315 4763 scope.go:117] "RemoveContainer" containerID="48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.795213 4763 scope.go:117] "RemoveContainer" containerID="287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.828183 4763 scope.go:117] "RemoveContainer" containerID="3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351" Oct 06 15:17:41 crc kubenswrapper[4763]: E1006 15:17:41.828636 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351\": container with ID starting with 3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351 not found: ID does not exist" containerID="3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.828678 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351"} err="failed to get container status \"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351\": rpc error: code = NotFound desc = could not find container \"3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351\": container with ID starting with 3199ef979767d2a3460694e44acf366030d04dcb56bbfcae91e035e8ca868351 not found: ID does not exist" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.828700 4763 scope.go:117] "RemoveContainer" containerID="48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2" Oct 06 15:17:41 crc kubenswrapper[4763]: E1006 15:17:41.828948 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2\": container with ID starting with 48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2 not found: ID does not exist" containerID="48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.828979 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2"} err="failed to get container status \"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2\": rpc error: code = NotFound desc = could not find container \"48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2\": container with ID starting with 48f28bca9d264c9cb70e9dff6f89962aaf5c90c36a25f4c58dcb49fce10b4ef2 not found: ID does not exist" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.828998 4763 scope.go:117] "RemoveContainer" containerID="287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d" Oct 06 15:17:41 crc kubenswrapper[4763]: E1006 15:17:41.829211 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d\": container with ID starting with 287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d not found: ID does not exist" containerID="287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d" Oct 06 15:17:41 crc kubenswrapper[4763]: I1006 15:17:41.829234 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d"} err="failed to get container status \"287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d\": rpc error: code = NotFound desc = could not find container \"287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d\": container with ID starting with 287e9c69f410baf5b46654345e7059e1e8eb296bed16712869d2977fba67ec8d not found: ID does not exist" Oct 06 15:17:43 crc kubenswrapper[4763]: I1006 15:17:43.591482 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" path="/var/lib/kubelet/pods/4bc9148e-1b42-4c70-abe3-626fcc55b0e8/volumes" Oct 06 15:18:03 crc kubenswrapper[4763]: I1006 15:18:03.876668 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:18:03 crc kubenswrapper[4763]: I1006 15:18:03.877233 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:18:03 crc kubenswrapper[4763]: I1006 15:18:03.877293 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:18:03 crc kubenswrapper[4763]: I1006 15:18:03.878107 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:18:03 crc kubenswrapper[4763]: I1006 15:18:03.878211 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e" gracePeriod=600 Oct 06 15:18:04 crc kubenswrapper[4763]: I1006 15:18:04.947821 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e" exitCode=0 Oct 06 15:18:04 crc kubenswrapper[4763]: I1006 15:18:04.947894 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e"} Oct 06 15:18:04 crc kubenswrapper[4763]: I1006 15:18:04.948410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006"} Oct 06 15:18:04 crc kubenswrapper[4763]: I1006 15:18:04.948445 4763 scope.go:117] "RemoveContainer" containerID="2cc326b9fc4c544f26c5cf613aae0ed392475a5aae225e91805a959c1915374a" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.228263 4763 scope.go:117] "RemoveContainer" containerID="6e894010adcea5c116cbd4e38f6bdd3832ca519307d5338247f59df695912ae0" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.271006 4763 scope.go:117] "RemoveContainer" containerID="c3caa7ea290d288d2300056350ea320341cdbe1c53cd822fd83f4992c1443266" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.296205 4763 scope.go:117] "RemoveContainer" containerID="8d4dbe263870060d5d730461509c5145f0a5edac849cd22428e9070476023189" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.335065 4763 scope.go:117] "RemoveContainer" containerID="fcf6164b2d181242c9b5562ef21adf8bd1d27334b6c36d6883ad95c6af383a2e" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.380999 4763 scope.go:117] "RemoveContainer" containerID="8bcc49ab811eb212bd4375f7cdbde3ddbb5e4e78565e5c34a1a7eda5d16ff196" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.422950 4763 scope.go:117] "RemoveContainer" containerID="213a25e22f02ae7ac149710de55e90923971454fd1c545a530dba766bc0d6bb1" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.450584 4763 scope.go:117] "RemoveContainer" containerID="380638c831a780a71f75f79db475f78294719dd4f058d1f572dc7d482dfdeec2" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.480160 4763 scope.go:117] "RemoveContainer" containerID="5da2eee9cbed0720c597485e546d3ebbefe89e4dd872ff9efcb3761f1de5329d" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.508280 4763 scope.go:117] "RemoveContainer" containerID="b3e95b1adef60fbddee41bd6936cab65de9301c6fd3e6e266eded3450d295996" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.543419 4763 scope.go:117] "RemoveContainer" containerID="2353c95d5781717c2fe7a35581cf271acd024551bdd9e72ffc6a127579831157" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.569426 4763 scope.go:117] "RemoveContainer" containerID="afd96ca75e1ed3b393dc75249667fe38b163d255761c91560b0325c02a34a298" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.614048 4763 scope.go:117] "RemoveContainer" containerID="b9e961308b9e2a5ad800c19e4dd19080c85b3ac3bc689989ac8c957262ec2a37" Oct 06 15:18:26 crc kubenswrapper[4763]: I1006 15:18:26.686983 4763 scope.go:117] "RemoveContainer" containerID="f59e250ba8dce3b88a299b3d9f53c567b19f7f179bd84af3fffb86a318ad5368" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.556561 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:29 crc kubenswrapper[4763]: E1006 15:18:29.558042 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="extract-utilities" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.558196 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" 
containerName="extract-utilities" Oct 06 15:18:29 crc kubenswrapper[4763]: E1006 15:18:29.558232 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="extract-content" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.558362 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="extract-content" Oct 06 15:18:29 crc kubenswrapper[4763]: E1006 15:18:29.558384 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="registry-server" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.558393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="registry-server" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.560722 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc9148e-1b42-4c70-abe3-626fcc55b0e8" containerName="registry-server" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.564653 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.596773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.654715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvbc\" (UniqueName: \"kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.654774 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.654800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.756524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvbc\" (UniqueName: \"kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.756609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.756650 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.757197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.757649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.783486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvbc\" (UniqueName: \"kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc\") pod \"certified-operators-d9wrv\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:29 crc kubenswrapper[4763]: I1006 15:18:29.891325 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:30 crc kubenswrapper[4763]: I1006 15:18:30.391142 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:31 crc kubenswrapper[4763]: I1006 15:18:31.227572 4763 generic.go:334] "Generic (PLEG): container finished" podID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerID="5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494" exitCode=0 Oct 06 15:18:31 crc kubenswrapper[4763]: I1006 15:18:31.227673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerDied","Data":"5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494"} Oct 06 15:18:31 crc kubenswrapper[4763]: I1006 15:18:31.228175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerStarted","Data":"71f5693d851187cb7592fba1825cf24b0fcd65d136081c8387a932c493ba1f98"} Oct 06 15:18:33 crc kubenswrapper[4763]: I1006 15:18:33.243894 4763 generic.go:334] "Generic (PLEG): container finished" podID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerID="39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8" exitCode=0 Oct 06 15:18:33 crc kubenswrapper[4763]: I1006 15:18:33.243994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerDied","Data":"39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8"} Oct 06 15:18:34 crc kubenswrapper[4763]: I1006 15:18:34.261769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" 
event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerStarted","Data":"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612"} Oct 06 15:18:34 crc kubenswrapper[4763]: I1006 15:18:34.284758 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d9wrv" podStartSLOduration=2.601646539 podStartE2EDuration="5.284738767s" podCreationTimestamp="2025-10-06 15:18:29 +0000 UTC" firstStartedPulling="2025-10-06 15:18:31.230889404 +0000 UTC m=+1508.386181916" lastFinishedPulling="2025-10-06 15:18:33.913981622 +0000 UTC m=+1511.069274144" observedRunningTime="2025-10-06 15:18:34.282307764 +0000 UTC m=+1511.437600286" watchObservedRunningTime="2025-10-06 15:18:34.284738767 +0000 UTC m=+1511.440031279" Oct 06 15:18:39 crc kubenswrapper[4763]: I1006 15:18:39.891712 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:39 crc kubenswrapper[4763]: I1006 15:18:39.893131 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:39 crc kubenswrapper[4763]: I1006 15:18:39.994452 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:40 crc kubenswrapper[4763]: I1006 15:18:40.380028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:40 crc kubenswrapper[4763]: I1006 15:18:40.440099 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.332557 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d9wrv" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="registry-server" containerID="cri-o://bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612" gracePeriod=2 Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.731857 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.855507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvbc\" (UniqueName: \"kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc\") pod \"6b712f91-e917-4a2d-a507-e66de3077b9a\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.855575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities\") pod \"6b712f91-e917-4a2d-a507-e66de3077b9a\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.855859 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content\") pod \"6b712f91-e917-4a2d-a507-e66de3077b9a\" (UID: \"6b712f91-e917-4a2d-a507-e66de3077b9a\") " Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.856390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities" (OuterVolumeSpecName: "utilities") pod "6b712f91-e917-4a2d-a507-e66de3077b9a" (UID: "6b712f91-e917-4a2d-a507-e66de3077b9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.857914 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.860437 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc" (OuterVolumeSpecName: "kube-api-access-xsvbc") pod "6b712f91-e917-4a2d-a507-e66de3077b9a" (UID: "6b712f91-e917-4a2d-a507-e66de3077b9a"). InnerVolumeSpecName "kube-api-access-xsvbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.912781 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b712f91-e917-4a2d-a507-e66de3077b9a" (UID: "6b712f91-e917-4a2d-a507-e66de3077b9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.960105 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b712f91-e917-4a2d-a507-e66de3077b9a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:18:42 crc kubenswrapper[4763]: I1006 15:18:42.960403 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvbc\" (UniqueName: \"kubernetes.io/projected/6b712f91-e917-4a2d-a507-e66de3077b9a-kube-api-access-xsvbc\") on node \"crc\" DevicePath \"\"" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.349176 4763 generic.go:334] "Generic (PLEG): container finished" podID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerID="bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612" exitCode=0 Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.349244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerDied","Data":"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612"} Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.349286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9wrv" event={"ID":"6b712f91-e917-4a2d-a507-e66de3077b9a","Type":"ContainerDied","Data":"71f5693d851187cb7592fba1825cf24b0fcd65d136081c8387a932c493ba1f98"} Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.349321 4763 scope.go:117] "RemoveContainer" containerID="bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.349334 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d9wrv" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.381699 4763 scope.go:117] "RemoveContainer" containerID="39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.411818 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.418494 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d9wrv"] Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.436597 4763 scope.go:117] "RemoveContainer" containerID="5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.465055 4763 scope.go:117] "RemoveContainer" containerID="bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612" Oct 06 15:18:43 crc kubenswrapper[4763]: E1006 15:18:43.465764 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612\": container with ID starting with bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612 not found: ID does not exist" containerID="bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.465847 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612"} err="failed to get container status \"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612\": rpc error: code = NotFound desc = could not find container \"bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612\": container with ID starting with bc409a0c7bfbfb4e92a3100f37824cd0ff8fc5df72ccbec8bf1b4e0373be8612 not found: ID does not exist" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.465900 4763 scope.go:117] "RemoveContainer" containerID="39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8" Oct 06 15:18:43 crc kubenswrapper[4763]: E1006 15:18:43.466549 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8\": container with ID starting with 39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8 not found: ID does not exist" containerID="39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.466607 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8"} err="failed to get container status \"39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8\": rpc error: code = NotFound desc = could not find container \"39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8\": container with ID starting with 39b69715b44532eaaf315de73bdbb556b449262a8a80b640a60fb8297ac5d7d8 not found: ID does not exist" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.466714 4763 scope.go:117] "RemoveContainer" containerID="5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494" Oct 06 15:18:43 crc kubenswrapper[4763]: E1006 15:18:43.467216 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494\": container with ID starting with 5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494 not found: ID does not exist" containerID="5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.467273 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494"} err="failed to get container status \"5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494\": rpc error: code = NotFound desc = could not find container \"5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494\": container with ID starting with 5b6c3f20a4a4b6bd9f8b8f7ea7342e731ed73e5ece2eee0cd4d1b26605a45494 not found: ID does not exist" Oct 06 15:18:43 crc kubenswrapper[4763]: I1006 15:18:43.592865 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" path="/var/lib/kubelet/pods/6b712f91-e917-4a2d-a507-e66de3077b9a/volumes" Oct 06 15:19:26 crc kubenswrapper[4763]: I1006 15:19:26.936290 4763 scope.go:117] "RemoveContainer" containerID="4793430acd2d3747e4f90ca4aa9269c1b1570b34bc980bdff5cf2808f5651841" Oct 06 15:19:26 crc kubenswrapper[4763]: I1006 15:19:26.970541 4763 scope.go:117] "RemoveContainer" containerID="60c57e4673b5def98438adfcc0581b8c0827205fea234915bea281492ad7e221" Oct 06 15:19:26 crc kubenswrapper[4763]: I1006 15:19:26.995875 4763 scope.go:117] "RemoveContainer" containerID="0cb10096907fb2a5c3d74e1bc6aad8f1891f33bcd0cf07fae66a4c4f6c4f002c" Oct 06 15:19:27 crc kubenswrapper[4763]: I1006 15:19:27.024630 4763 scope.go:117] "RemoveContainer" containerID="407915862e4e0141f6174c767f9b9f07b89723636e5be3b23e2f2c38957637a1" Oct 06 15:19:27 crc kubenswrapper[4763]: I1006 15:19:27.046671 4763 scope.go:117] "RemoveContainer" containerID="7e2562acc0fae8c8886b551ae63328fe2b596b0aa0ed9eebdb4aafd3a62ad900" Oct 06 15:19:27 crc kubenswrapper[4763]: I1006 15:19:27.077407 4763 scope.go:117] "RemoveContainer" containerID="e01c05d2c451fef7ff0f3b5fbec07d8d7fe6fdefbd56476079b9af62d60685a6" Oct 06 15:19:27 crc kubenswrapper[4763]: I1006 15:19:27.101796 4763 scope.go:117] "RemoveContainer" containerID="6a9b20147a6347915ae66b7027ee7a03bca0f91c2b7745f8b0f3b2ac537af3dd" Oct 06 15:20:27 crc kubenswrapper[4763]: I1006 15:20:27.249293 4763 scope.go:117] "RemoveContainer" containerID="ccd62700fafdf3a6f1684e5b5db825da9d723dfeda19d35346fbeaad12324951" Oct 06 15:20:27 crc kubenswrapper[4763]: I1006 15:20:27.292139 4763 scope.go:117] "RemoveContainer" containerID="3639f3cd97ad76ec5113dd306c9f03ea2aa1f8d9570485be1cca76f7d68fed2f" Oct 06 15:20:27 crc kubenswrapper[4763]: I1006 15:20:27.341405 4763 scope.go:117] "RemoveContainer" containerID="bde4aed9fd9a6c5eefae7c1cc33564849b0279bade7ce7e70e2159cd7431f7f7" Oct 06 15:20:27 crc kubenswrapper[4763]: I1006 15:20:27.382793 4763 scope.go:117] "RemoveContainer" containerID="1a19de1d681a2184e6bc1c868810837a0a80d978dbe170b9eb27d56fe034b52f" Oct 06 15:20:27 crc kubenswrapper[4763]: I1006 15:20:27.431468 4763 scope.go:117] "RemoveContainer" containerID="f120c52d33a0f9d6e29cb913ca8a9664d3a0d2f8ad98c0347d03b9ff605947bd" Oct 06 15:20:33 crc kubenswrapper[4763]: I1006 15:20:33.876676 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:20:33 crc kubenswrapper[4763]: I1006 15:20:33.877049 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.570054 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:43 crc kubenswrapper[4763]: E1006 15:20:43.570866 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="extract-content" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.570889 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="extract-content" Oct 06 15:20:43 crc kubenswrapper[4763]: E1006 15:20:43.570934 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="extract-utilities" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.570945 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="extract-utilities" Oct 06 15:20:43 crc kubenswrapper[4763]: E1006 15:20:43.570964 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="registry-server" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.570976 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="registry-server" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.571199 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b712f91-e917-4a2d-a507-e66de3077b9a" containerName="registry-server" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.572792 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.587479 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.719116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.719206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.719328 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h47h\" (UniqueName: \"kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.820650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.821681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h47h\" (UniqueName: \"kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.822378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.821535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.822770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.854484 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6h47h\" (UniqueName: \"kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h\") pod \"redhat-marketplace-868np\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:43 crc kubenswrapper[4763]: I1006 15:20:43.904844 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.166550 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.168377 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.178293 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.328501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvnz\" (UniqueName: \"kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.328553 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.328579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.395096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.429793 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.429843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.429934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvnz\" (UniqueName: \"kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " 
pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.430703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.430722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.431764 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerStarted","Data":"e8ea3ccb1ca660ad12b9fbf1c87922bd8a12553b2157a374f006f9c91addae10"} Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.453026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvnz\" (UniqueName: \"kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz\") pod \"redhat-operators-ps8j9\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.493123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:44 crc kubenswrapper[4763]: I1006 15:20:44.738539 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:44 crc kubenswrapper[4763]: W1006 15:20:44.745438 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3ab79d_826c_4355_8e49_7e122e1390ee.slice/crio-48aaf49aae83fdc84c63c084a71c0f84a2a1f33192382368340ff322c900d06d WatchSource:0}: Error finding container 48aaf49aae83fdc84c63c084a71c0f84a2a1f33192382368340ff322c900d06d: Status 404 returned error can't find the container with id 48aaf49aae83fdc84c63c084a71c0f84a2a1f33192382368340ff322c900d06d Oct 06 15:20:45 crc kubenswrapper[4763]: I1006 15:20:45.440379 4763 generic.go:334] "Generic (PLEG): container finished" podID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerID="cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a" exitCode=0 Oct 06 15:20:45 crc kubenswrapper[4763]: I1006 15:20:45.440472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerDied","Data":"cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a"} Oct 06 15:20:45 crc kubenswrapper[4763]: I1006 15:20:45.442075 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerID="57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6" exitCode=0 Oct 06 15:20:45 crc kubenswrapper[4763]: I1006 15:20:45.442097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" 
event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerDied","Data":"57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6"} Oct 06 15:20:45 crc kubenswrapper[4763]: I1006 15:20:45.442115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerStarted","Data":"48aaf49aae83fdc84c63c084a71c0f84a2a1f33192382368340ff322c900d06d"} Oct 06 15:20:47 crc kubenswrapper[4763]: I1006 15:20:47.460592 4763 generic.go:334] "Generic (PLEG): container finished" podID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerID="f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420" exitCode=0 Oct 06 15:20:47 crc kubenswrapper[4763]: I1006 15:20:47.460647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerDied","Data":"f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420"} Oct 06 15:20:47 crc kubenswrapper[4763]: I1006 15:20:47.463090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerStarted","Data":"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe"} Oct 06 15:20:48 crc kubenswrapper[4763]: I1006 15:20:48.475850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerStarted","Data":"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd"} Oct 06 15:20:48 crc kubenswrapper[4763]: I1006 15:20:48.480345 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerID="05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe" exitCode=0 Oct 06 15:20:48 crc kubenswrapper[4763]: I1006 15:20:48.480419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerDied","Data":"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe"} Oct 06 15:20:48 crc kubenswrapper[4763]: I1006 15:20:48.516116 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-868np" podStartSLOduration=2.900026319 podStartE2EDuration="5.516092098s" podCreationTimestamp="2025-10-06 15:20:43 +0000 UTC" firstStartedPulling="2025-10-06 15:20:45.442683882 +0000 UTC m=+1642.597976394" lastFinishedPulling="2025-10-06 15:20:48.058749631 +0000 UTC m=+1645.214042173" observedRunningTime="2025-10-06 15:20:48.513515037 +0000 UTC m=+1645.668807589" watchObservedRunningTime="2025-10-06 15:20:48.516092098 +0000 UTC m=+1645.671384630" Oct 06 15:20:49 crc kubenswrapper[4763]: I1006 15:20:49.493771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerStarted","Data":"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9"} Oct 06 15:20:49 crc kubenswrapper[4763]: I1006 15:20:49.514801 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ps8j9" podStartSLOduration=2.004910992 podStartE2EDuration="5.514782524s" podCreationTimestamp="2025-10-06 15:20:44 +0000 UTC" 
firstStartedPulling="2025-10-06 15:20:45.444211104 +0000 UTC m=+1642.599503616" lastFinishedPulling="2025-10-06 15:20:48.954082606 +0000 UTC m=+1646.109375148" observedRunningTime="2025-10-06 15:20:49.514063805 +0000 UTC m=+1646.669356337" watchObservedRunningTime="2025-10-06 15:20:49.514782524 +0000 UTC m=+1646.670075036" Oct 06 15:20:53 crc kubenswrapper[4763]: I1006 15:20:53.905586 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:53 crc kubenswrapper[4763]: I1006 15:20:53.906259 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:53 crc kubenswrapper[4763]: I1006 15:20:53.958350 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:54 crc kubenswrapper[4763]: I1006 15:20:54.493509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:54 crc kubenswrapper[4763]: I1006 15:20:54.493565 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:54 crc kubenswrapper[4763]: I1006 15:20:54.539852 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:54 crc kubenswrapper[4763]: I1006 15:20:54.592483 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:54 crc kubenswrapper[4763]: I1006 15:20:54.595397 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:55 crc kubenswrapper[4763]: I1006 15:20:55.956859 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:56 crc kubenswrapper[4763]: I1006 15:20:56.560389 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-868np" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="registry-server" containerID="cri-o://400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd" gracePeriod=2 Oct 06 15:20:56 crc kubenswrapper[4763]: I1006 15:20:56.945407 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:56 crc kubenswrapper[4763]: I1006 15:20:56.958607 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:56 crc kubenswrapper[4763]: I1006 15:20:56.958971 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ps8j9" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="registry-server" containerID="cri-o://cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9" gracePeriod=2 Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.116768 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h47h\" (UniqueName: \"kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h\") pod \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.117128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities\") pod \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.117201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content\") pod \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\" (UID: \"925c3672-fe7f-4034-bcd1-1ed7d88a48cf\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.118180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities" (OuterVolumeSpecName: "utilities") pod "925c3672-fe7f-4034-bcd1-1ed7d88a48cf" (UID: "925c3672-fe7f-4034-bcd1-1ed7d88a48cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.126980 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h" (OuterVolumeSpecName: "kube-api-access-6h47h") pod "925c3672-fe7f-4034-bcd1-1ed7d88a48cf" (UID: "925c3672-fe7f-4034-bcd1-1ed7d88a48cf"). InnerVolumeSpecName "kube-api-access-6h47h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.133166 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "925c3672-fe7f-4034-bcd1-1ed7d88a48cf" (UID: "925c3672-fe7f-4034-bcd1-1ed7d88a48cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.219251 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.219289 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h47h\" (UniqueName: \"kubernetes.io/projected/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-kube-api-access-6h47h\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.219303 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925c3672-fe7f-4034-bcd1-1ed7d88a48cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.326927 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.425335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities\") pod \"4b3ab79d-826c-4355-8e49-7e122e1390ee\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.425457 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swvnz\" (UniqueName: \"kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz\") pod \"4b3ab79d-826c-4355-8e49-7e122e1390ee\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.425488 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content\") pod \"4b3ab79d-826c-4355-8e49-7e122e1390ee\" (UID: \"4b3ab79d-826c-4355-8e49-7e122e1390ee\") " Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.426445 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities" (OuterVolumeSpecName: "utilities") pod "4b3ab79d-826c-4355-8e49-7e122e1390ee" (UID: "4b3ab79d-826c-4355-8e49-7e122e1390ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.433801 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz" (OuterVolumeSpecName: "kube-api-access-swvnz") pod "4b3ab79d-826c-4355-8e49-7e122e1390ee" (UID: "4b3ab79d-826c-4355-8e49-7e122e1390ee"). InnerVolumeSpecName "kube-api-access-swvnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.516381 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b3ab79d-826c-4355-8e49-7e122e1390ee" (UID: "4b3ab79d-826c-4355-8e49-7e122e1390ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.527233 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.527282 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swvnz\" (UniqueName: \"kubernetes.io/projected/4b3ab79d-826c-4355-8e49-7e122e1390ee-kube-api-access-swvnz\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.527296 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3ab79d-826c-4355-8e49-7e122e1390ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.574442 4763 generic.go:334] "Generic (PLEG): container finished" podID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerID="400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd" exitCode=0 Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.574608 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868np" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.581858 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerID="cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9" exitCode=0 Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.581977 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ps8j9" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.593977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerDied","Data":"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd"} Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.594030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868np" event={"ID":"925c3672-fe7f-4034-bcd1-1ed7d88a48cf","Type":"ContainerDied","Data":"e8ea3ccb1ca660ad12b9fbf1c87922bd8a12553b2157a374f006f9c91addae10"} Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.594045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerDied","Data":"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9"} Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.594060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ps8j9" event={"ID":"4b3ab79d-826c-4355-8e49-7e122e1390ee","Type":"ContainerDied","Data":"48aaf49aae83fdc84c63c084a71c0f84a2a1f33192382368340ff322c900d06d"} Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.594080 4763 scope.go:117] "RemoveContainer" containerID="400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.617227 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.625046 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-868np"] Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.641894 4763 scope.go:117] "RemoveContainer" containerID="f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.650997 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.660272 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ps8j9"] Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.670352 4763 scope.go:117] "RemoveContainer" containerID="cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.704328 4763 scope.go:117] "RemoveContainer" containerID="400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.704869 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd\": container with ID starting with 400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd not found: ID does not exist" containerID="400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.704907 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd"} err="failed to get container status \"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd\": rpc error: code = NotFound desc = could not find container \"400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd\": container with ID starting with 400a8bccfdc12633e9930f14ccc4bff333d305019beb85770f90b202112d8fbd not found: ID does not exist" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.704934 4763 scope.go:117] "RemoveContainer" containerID="f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.705275 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420\": container with ID starting with f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420 not found: ID does not exist" containerID="f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.705322 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420"} err="failed to get container status \"f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420\": rpc error: code = NotFound desc = could not find container \"f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420\": container with ID starting with f6df2553f0888a755533c5203e8f9e77e7bd4efbfae5743b4cb150bd51ae1420 not found: ID does not exist" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.705351 4763 scope.go:117] "RemoveContainer" containerID="cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.705732 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a\": container with ID starting with cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a not found: ID does not exist" containerID="cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.705775 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a"} err="failed to get container status \"cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a\": rpc error: code = NotFound desc = could not find container \"cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a\": container with ID starting with cb6700905ef88aaae9d1d09a86a8fdd2314a49a681e544754ebb726ccbf2397a not found: ID does not exist" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.705804 4763 scope.go:117] "RemoveContainer" containerID="cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.723913 4763 scope.go:117] "RemoveContainer" containerID="05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.747949 4763 scope.go:117] "RemoveContainer" containerID="57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.774641 4763 scope.go:117] "RemoveContainer" containerID="cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.775522 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9\": container with ID starting with cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9 not found: ID does not exist" containerID="cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.775563 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9"} err="failed to get container status \"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9\": rpc error: code = NotFound desc = could not find container \"cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9\": container with ID starting with cfd94ca29b4929589cc87ac37d620ab40a4e7d03af3f5b1e2c17b379038625b9 not found: ID does not exist" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.775590 4763 scope.go:117] "RemoveContainer" containerID="05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.775905 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe\": container with ID starting with 05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe not found: ID does not exist" containerID="05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.775928 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe"} err="failed to get container status \"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe\": rpc error: code = NotFound desc = could not find container \"05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe\": container with ID starting with 05d119a55b2e88906386cf8e3b98d90aa70b1809c80c25ef5839b2594dd5e3fe not found: ID does not exist" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.775945 4763 scope.go:117] "RemoveContainer" containerID="57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6" Oct 06 15:20:57 crc kubenswrapper[4763]: E1006 15:20:57.776376 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6\": container with ID starting with 57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6 not found: ID does not exist" containerID="57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6" Oct 06 15:20:57 crc kubenswrapper[4763]: I1006 15:20:57.776394 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6"} err="failed to get container status \"57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6\": rpc error: code = NotFound desc = could not find container \"57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6\": container with ID starting with 57b7eb484bf81d0e271e2b5bf47c3b78f8a93a3466069a6f0fc28df9065355e6 not found: ID does not exist" Oct 06 15:20:59 crc kubenswrapper[4763]: I1006 15:20:59.591069 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" path="/var/lib/kubelet/pods/4b3ab79d-826c-4355-8e49-7e122e1390ee/volumes" Oct 06 15:20:59 crc kubenswrapper[4763]: I1006 15:20:59.592573 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" path="/var/lib/kubelet/pods/925c3672-fe7f-4034-bcd1-1ed7d88a48cf/volumes" Oct 06 15:21:03 crc kubenswrapper[4763]: I1006 15:21:03.876970 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:21:03 crc kubenswrapper[4763]: I1006 15:21:03.877359 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:21:33 crc kubenswrapper[4763]: I1006 15:21:33.876547 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:21:33 crc kubenswrapper[4763]: I1006 15:21:33.877241 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:21:33 crc kubenswrapper[4763]: I1006 15:21:33.877299 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:21:33 crc kubenswrapper[4763]: I1006 15:21:33.877868 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:21:33 crc kubenswrapper[4763]: I1006 15:21:33.877932 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006" gracePeriod=600 Oct 06 15:21:34 crc kubenswrapper[4763]: E1006 15:21:34.007390 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:21:34 crc kubenswrapper[4763]: I1006 15:21:34.906971 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006" exitCode=0 Oct 06 15:21:34 crc kubenswrapper[4763]: I1006 15:21:34.907025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006"} Oct 06 15:21:34 crc kubenswrapper[4763]: I1006 15:21:34.907059 4763 scope.go:117] "RemoveContainer" containerID="598b0409ddd13b1cc4decd02be05f861e61f403a2ac96c7d8d43d4a7f393169e" Oct 06 15:21:34 crc kubenswrapper[4763]: I1006 15:21:34.907833 4763 scope.go:117] "RemoveContainer" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006" Oct 06 15:21:34 crc kubenswrapper[4763]: E1006 15:21:34.908505 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:21:47 crc kubenswrapper[4763]: I1006 15:21:47.574813 4763 scope.go:117] "RemoveContainer" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006" Oct 06 15:21:47 crc kubenswrapper[4763]: E1006 15:21:47.575464 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
[Editor's note: from 15:22:02 through 15:26:25 the kubelet logs the same pair, I "RemoveContainer" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006" followed by E "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s restarting failed container=machine-config-daemon ...", 22 more times at 11-15 second intervals; those near-duplicates are elided.]
Oct 06 15:26:40 crc kubenswrapper[4763]: I1006 15:26:40.574539 4763 scope.go:117] "RemoveContainer" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006"
Oct 06 15:26:41 crc kubenswrapper[4763]: I1006 15:26:41.585674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4"}
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.943425 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5phh7"]
Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944295 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="registry-server"
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944310 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="registry-server"
Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944320 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="extract-utilities"
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944327 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="extract-utilities"
Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944337 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="extract-content"
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="extract-content"
Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944351 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="extract-utilities"
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944357 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="extract-utilities"
Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944369 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="registry-server"
Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944375 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="registry-server"
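[Editor's note: two mechanisms stack in the run elided above. The pod worker re-queues the sync every 11-15 seconds (the repeating "RemoveContainer" / "Error syncing pod" pairs), but each attempt is refused while the container's crash-loop back-off window is open; "back-off 5m0s" shows the back-off had already reached the kubelet's cap, and indeed the container killed at 15:21:33 is next started at 15:26:40, a full five minutes later. A sketch of the doubling schedule, with base and cap taken from the kubelet's well-known defaults (an assumption on my part; this log only shows the 5m0s cap in effect):

package main

import (
	"fmt"
	"time"
)

// crashLoopDelays builds a kubelet-style restart back-off schedule: the wait
// doubles after every failed start until it reaches limit.
func crashLoopDelays(base, limit time.Duration, restarts int) []time.Duration {
	delays := make([]time.Duration, 0, restarts)
	d := base
	for i := 0; i < restarts; i++ {
		delays = append(delays, d)
		d *= 2
		if d > limit {
			d = limit
		}
	}
	return delays
}

func main() {
	for i, d := range crashLoopDelays(10*time.Second, 5*time.Minute, 8) {
		fmt.Printf("restart %d: wait %v\n", i+1, d)
	}
	// 10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s: once the cap is reached,
	// every further restart waits the full "back-off 5m0s" seen above.
}

End of note.]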
podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="registry-server" Oct 06 15:28:03 crc kubenswrapper[4763]: E1006 15:28:03.944390 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="extract-content" Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944395 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="extract-content" Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944514 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3ab79d-826c-4355-8e49-7e122e1390ee" containerName="registry-server" Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.944528 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="925c3672-fe7f-4034-bcd1-1ed7d88a48cf" containerName="registry-server" Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.945598 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:03 crc kubenswrapper[4763]: I1006 15:28:03.961738 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5phh7"] Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.077875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.077964 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktv8\" (UniqueName: \"kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.077997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.179514 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.179581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktv8\" (UniqueName: \"kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.179606 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content\") pod 
\"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.180094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.180718 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.199520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktv8\" (UniqueName: \"kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8\") pod \"community-operators-5phh7\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.268358 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:04 crc kubenswrapper[4763]: I1006 15:28:04.794918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5phh7"] Oct 06 15:28:05 crc kubenswrapper[4763]: I1006 15:28:05.334216 4763 generic.go:334] "Generic (PLEG): container finished" podID="164089a2-5649-4f38-8249-61436f715258" containerID="d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02" exitCode=0 Oct 06 15:28:05 crc kubenswrapper[4763]: I1006 15:28:05.334425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerDied","Data":"d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02"} Oct 06 15:28:05 crc kubenswrapper[4763]: I1006 15:28:05.334550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerStarted","Data":"6b14ef6cb073131301451a7d46763bedd82ab5c91a06c8a18b6f4d5879cd9668"} Oct 06 15:28:05 crc kubenswrapper[4763]: I1006 15:28:05.340293 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:28:06 crc kubenswrapper[4763]: I1006 15:28:06.344508 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerStarted","Data":"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8"} Oct 06 15:28:07 crc kubenswrapper[4763]: I1006 15:28:07.354315 4763 generic.go:334] "Generic (PLEG): container finished" podID="164089a2-5649-4f38-8249-61436f715258" containerID="a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8" exitCode=0 Oct 06 15:28:07 crc kubenswrapper[4763]: I1006 15:28:07.354413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" 
event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerDied","Data":"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8"} Oct 06 15:28:08 crc kubenswrapper[4763]: I1006 15:28:08.366412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerStarted","Data":"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850"} Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.269514 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.271011 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.331901 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.363930 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5phh7" podStartSLOduration=8.893468654 podStartE2EDuration="11.363912365s" podCreationTimestamp="2025-10-06 15:28:03 +0000 UTC" firstStartedPulling="2025-10-06 15:28:05.339830928 +0000 UTC m=+2082.495123480" lastFinishedPulling="2025-10-06 15:28:07.810274669 +0000 UTC m=+2084.965567191" observedRunningTime="2025-10-06 15:28:08.387487193 +0000 UTC m=+2085.542779705" watchObservedRunningTime="2025-10-06 15:28:14.363912365 +0000 UTC m=+2091.519204877" Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.483447 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:14 crc kubenswrapper[4763]: I1006 15:28:14.573210 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5phh7"] Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.432565 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5phh7" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="registry-server" containerID="cri-o://5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850" gracePeriod=2 Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.872283 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.982388 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities\") pod \"164089a2-5649-4f38-8249-61436f715258\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.982484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mktv8\" (UniqueName: \"kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8\") pod \"164089a2-5649-4f38-8249-61436f715258\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.982810 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content\") pod \"164089a2-5649-4f38-8249-61436f715258\" (UID: \"164089a2-5649-4f38-8249-61436f715258\") " Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.983369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities" (OuterVolumeSpecName: "utilities") pod "164089a2-5649-4f38-8249-61436f715258" (UID: "164089a2-5649-4f38-8249-61436f715258"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:28:16 crc kubenswrapper[4763]: I1006 15:28:16.989376 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8" (OuterVolumeSpecName: "kube-api-access-mktv8") pod "164089a2-5649-4f38-8249-61436f715258" (UID: "164089a2-5649-4f38-8249-61436f715258"). InnerVolumeSpecName "kube-api-access-mktv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.037593 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "164089a2-5649-4f38-8249-61436f715258" (UID: "164089a2-5649-4f38-8249-61436f715258"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.084044 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.084076 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164089a2-5649-4f38-8249-61436f715258-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.084090 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mktv8\" (UniqueName: \"kubernetes.io/projected/164089a2-5649-4f38-8249-61436f715258-kube-api-access-mktv8\") on node \"crc\" DevicePath \"\"" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.445345 4763 generic.go:334] "Generic (PLEG): container finished" podID="164089a2-5649-4f38-8249-61436f715258" containerID="5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850" exitCode=0 Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.445460 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5phh7" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.446332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerDied","Data":"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850"} Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.446472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5phh7" event={"ID":"164089a2-5649-4f38-8249-61436f715258","Type":"ContainerDied","Data":"6b14ef6cb073131301451a7d46763bedd82ab5c91a06c8a18b6f4d5879cd9668"} Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.446575 4763 scope.go:117] "RemoveContainer" containerID="5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.470507 4763 scope.go:117] "RemoveContainer" containerID="a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.497736 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5phh7"] Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.503468 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5phh7"] Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.514636 4763 scope.go:117] "RemoveContainer" containerID="d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.544357 4763 scope.go:117] "RemoveContainer" containerID="5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850" Oct 06 15:28:17 crc kubenswrapper[4763]: E1006 15:28:17.545047 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850\": container with ID starting with 5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850 not found: ID does not exist" containerID="5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.545084 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850"} err="failed to get container status \"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850\": rpc error: code = NotFound desc = could not find container \"5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850\": container with ID starting with 5c74a2fcc15784af47da37b85ede8f653441351874a797013dc9adbfe317a850 not found: ID does not exist" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.545110 4763 scope.go:117] "RemoveContainer" containerID="a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8" Oct 06 15:28:17 crc kubenswrapper[4763]: E1006 15:28:17.545544 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8\": container with ID starting with a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8 not found: ID does not exist" containerID="a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.545578 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8"} err="failed to get container status \"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8\": rpc error: code = NotFound desc = could not find container \"a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8\": container with ID starting with a11fe0da105c0666e8fcdc3cae068ef8441ec045b4c390fd13d98444dddbd4e8 not found: ID does not exist" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.545595 4763 scope.go:117] "RemoveContainer" containerID="d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02" Oct 06 15:28:17 crc kubenswrapper[4763]: E1006 15:28:17.545881 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02\": container with ID starting with d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02 not found: ID does not exist" containerID="d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.545910 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02"} err="failed to get container status \"d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02\": rpc error: code = NotFound desc = could not find container \"d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02\": container with ID starting with d02e4da9aacd85b804209d18508ad49da7d2530f9337da805ee3dc9db70afd02 not found: ID does not exist" Oct 06 15:28:17 crc kubenswrapper[4763]: I1006 15:28:17.583825 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164089a2-5649-4f38-8249-61436f715258" path="/var/lib/kubelet/pods/164089a2-5649-4f38-8249-61436f715258/volumes" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.220320 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"] Oct 06 15:29:02 crc kubenswrapper[4763]: E1006 15:29:02.222482 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
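[Editor's note: "Cleaned up orphaned pod volumes dir" is the kubelet's periodic housekeeping sweep: once a pod has been removed from the API and its volumes unmounted, the leftover /var/lib/kubelet/pods/<podUID>/volumes directory is deleted. A toy sweep with the same shape (paths and the active set are illustrative, not the kubelet's implementation, which additionally verifies that nothing is still mounted underneath):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedPodDirs removes <root>/<podUID>/volumes for every pod
// directory whose UID is no longer active, echoing the kubelet's
// "Cleaned up orphaned pod volumes dir" messages.
func cleanupOrphanedPodDirs(root string, active map[string]bool) error {
	entries, err := os.ReadDir(root)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if !e.IsDir() || active[e.Name()] {
			continue // pod still running; leave its volumes alone
		}
		volumes := filepath.Join(root, e.Name(), "volumes")
		if err := os.RemoveAll(volumes); err != nil {
			return err
		}
		fmt.Printf("Cleaned up orphaned pod volumes dir podUID=%q path=%q\n", e.Name(), volumes)
	}
	return nil
}

func main() {
	// Illustrative scratch dir standing in for /var/lib/kubelet/pods.
	root, err := os.MkdirTemp("", "pods")
	if err != nil {
		panic(err)
	}
	defer os.RemoveAll(root)
	podDir := filepath.Join(root, "164089a2-5649-4f38-8249-61436f715258", "volumes")
	if err := os.MkdirAll(podDir, 0o755); err != nil {
		panic(err)
	}
	if err := cleanupOrphanedPodDirs(root, map[string]bool{}); err != nil {
		fmt.Println("cleanup failed:", err)
	}
}

End of note.]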
podUID="164089a2-5649-4f38-8249-61436f715258" containerName="extract-utilities" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.222544 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="extract-utilities" Oct 06 15:29:02 crc kubenswrapper[4763]: E1006 15:29:02.222593 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="extract-content" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.222647 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="extract-content" Oct 06 15:29:02 crc kubenswrapper[4763]: E1006 15:29:02.222693 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="registry-server" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.222710 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="registry-server" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.223039 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="164089a2-5649-4f38-8249-61436f715258" containerName="registry-server" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.225393 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfbx6" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.239413 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"] Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.367228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dntqq\" (UniqueName: \"kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.367419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.367562 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.468872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dntqq\" (UniqueName: \"kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6" Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.468967 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities\") pod 
Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.469007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.469442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.469723 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.495671 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dntqq\" (UniqueName: \"kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq\") pod \"certified-operators-jfbx6\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") " pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:02 crc kubenswrapper[4763]: I1006 15:29:02.562831 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.091834 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"]
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.835005 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb126528-c878-431b-9653-2ec09fd9ae81" containerID="e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a" exitCode=0
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.835084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerDied","Data":"e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a"}
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.835455 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerStarted","Data":"71f2386534507d34d436513c02d01fb47b8d4ccc3dc3f85b8058dfedafbe0b9f"}
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.876638 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:29:03 crc kubenswrapper[4763]: I1006 15:29:03.876710 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:29:04 crc kubenswrapper[4763]: I1006 15:29:04.857211 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerStarted","Data":"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"}
Oct 06 15:29:05 crc kubenswrapper[4763]: I1006 15:29:05.866197 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb126528-c878-431b-9653-2ec09fd9ae81" containerID="111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0" exitCode=0
Oct 06 15:29:05 crc kubenswrapper[4763]: I1006 15:29:05.866244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerDied","Data":"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"}
Oct 06 15:29:06 crc kubenswrapper[4763]: I1006 15:29:06.880082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerStarted","Data":"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"}
Oct 06 15:29:06 crc kubenswrapper[4763]: I1006 15:29:06.912081 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfbx6" podStartSLOduration=2.388351522 podStartE2EDuration="4.912063339s" podCreationTimestamp="2025-10-06 15:29:02 +0000 UTC" firstStartedPulling="2025-10-06 15:29:03.836890982 +0000 UTC m=+2140.992183524" lastFinishedPulling="2025-10-06 15:29:06.360602819 +0000 UTC m=+2143.515895341" observedRunningTime="2025-10-06 15:29:06.907927336 +0000 UTC m=+2144.063219848" watchObservedRunningTime="2025-10-06 15:29:06.912063339 +0000 UTC m=+2144.067355861"
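In the startup-latency entry above, podStartE2EDuration counts from pod creation to observed running, while podStartSLOduration excludes the image-pull window. The monotonic (m=+...) offsets confirm it: 2143.515895341 - 2140.992183524 = 2.523711817s of pulling, and 4.912063339s - 2.523711817s = 2.388351522s, exactly the logged SLO duration. A quick check of that arithmetic, with the values copied from the entry:

```go
package main

import "fmt"

func main() {
	// Monotonic clock offsets (the m=+ values) from the log entry above.
	firstStartedPulling := 2140.992183524
	lastFinishedPulling := 2143.515895341
	e2e := 4.912063339 // podStartE2EDuration in seconds

	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window: %.9fs\n", pull)     // 2.523711817s
	fmt.Printf("SLO duration:      %.9fs\n", e2e-pull) // 2.388351522s, matches podStartSLOduration
}
```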
Oct 06 15:29:12 crc kubenswrapper[4763]: I1006 15:29:12.563141 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:12 crc kubenswrapper[4763]: I1006 15:29:12.563433 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:12 crc kubenswrapper[4763]: I1006 15:29:12.620191 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:12 crc kubenswrapper[4763]: I1006 15:29:12.964253 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:13 crc kubenswrapper[4763]: I1006 15:29:13.001291 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"]
Oct 06 15:29:14 crc kubenswrapper[4763]: I1006 15:29:14.945016 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfbx6" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="registry-server" containerID="cri-o://5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4" gracePeriod=2
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.380190 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.553026 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content\") pod \"cb126528-c878-431b-9653-2ec09fd9ae81\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") "
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.553098 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities\") pod \"cb126528-c878-431b-9653-2ec09fd9ae81\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") "
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.553260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dntqq\" (UniqueName: \"kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq\") pod \"cb126528-c878-431b-9653-2ec09fd9ae81\" (UID: \"cb126528-c878-431b-9653-2ec09fd9ae81\") "
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.554460 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities" (OuterVolumeSpecName: "utilities") pod "cb126528-c878-431b-9653-2ec09fd9ae81" (UID: "cb126528-c878-431b-9653-2ec09fd9ae81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.557855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq" (OuterVolumeSpecName: "kube-api-access-dntqq") pod "cb126528-c878-431b-9653-2ec09fd9ae81" (UID: "cb126528-c878-431b-9653-2ec09fd9ae81"). InnerVolumeSpecName "kube-api-access-dntqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.656099 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dntqq\" (UniqueName: \"kubernetes.io/projected/cb126528-c878-431b-9653-2ec09fd9ae81-kube-api-access-dntqq\") on node \"crc\" DevicePath \"\""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.656166 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.802387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb126528-c878-431b-9653-2ec09fd9ae81" (UID: "cb126528-c878-431b-9653-2ec09fd9ae81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.860008 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb126528-c878-431b-9653-2ec09fd9ae81-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.960805 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb126528-c878-431b-9653-2ec09fd9ae81" containerID="5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4" exitCode=0
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.960852 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerDied","Data":"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"}
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.960864 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfbx6"
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.960878 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfbx6" event={"ID":"cb126528-c878-431b-9653-2ec09fd9ae81","Type":"ContainerDied","Data":"71f2386534507d34d436513c02d01fb47b8d4ccc3dc3f85b8058dfedafbe0b9f"}
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.960894 4763 scope.go:117] "RemoveContainer" containerID="5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"
Oct 06 15:29:15 crc kubenswrapper[4763]: I1006 15:29:15.992243 4763 scope.go:117] "RemoveContainer" containerID="111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.010840 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"]
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.015343 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfbx6"]
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.029786 4763 scope.go:117] "RemoveContainer" containerID="e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.059466 4763 scope.go:117] "RemoveContainer" containerID="5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"
Oct 06 15:29:16 crc kubenswrapper[4763]: E1006 15:29:16.060203 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4\": container with ID starting with 5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4 not found: ID does not exist" containerID="5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.060253 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4"} err="failed to get container status \"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4\": rpc error: code = NotFound desc = could not find container \"5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4\": container with ID starting with 5868a80283c80d8aa61c0244817c34ac6bdbef5ecd3afffeb7c7ce204ace9ff4 not found: ID does not exist"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.060286 4763 scope.go:117] "RemoveContainer" containerID="111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"
Oct 06 15:29:16 crc kubenswrapper[4763]: E1006 15:29:16.061117 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0\": container with ID starting with 111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0 not found: ID does not exist" containerID="111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.061184 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0"} err="failed to get container status \"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0\": rpc error: code = NotFound desc = could not find container \"111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0\": container with ID starting with 111f527426d4eb558970ba52c8eafe45bbbfc6183f61203392e8e8afd6d5fda0 not found: ID does not exist"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.061234 4763 scope.go:117] "RemoveContainer" containerID="e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a"
Oct 06 15:29:16 crc kubenswrapper[4763]: E1006 15:29:16.062372 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a\": container with ID starting with e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a not found: ID does not exist" containerID="e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a"
Oct 06 15:29:16 crc kubenswrapper[4763]: I1006 15:29:16.062401 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a"} err="failed to get container status \"e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a\": rpc error: code = NotFound desc = could not find container \"e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a\": container with ID starting with e5a92671a2ea47a376f232c69de51d36388ba5250f97b1d9a815f23e963a6b6a not found: ID does not exist"
Oct 06 15:29:17 crc kubenswrapper[4763]: I1006 15:29:17.591445 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" path="/var/lib/kubelet/pods/cb126528-c878-431b-9653-2ec09fd9ae81/volumes"
Oct 06 15:29:33 crc kubenswrapper[4763]: I1006 15:29:33.876509 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:29:33 crc kubenswrapper[4763]: I1006 15:29:33.877234 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.170443 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"]
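collect-profiles is an OLM CronJob, and the job-name suffix is the scheduled time in minutes since the Unix epoch, the standard naming for CronJob-spawned Jobs: 29329410 min x 60 = 1759764600s, which is 2025-10-06 15:30:00 UTC, exactly when the ADD above arrives. A one-liner to verify:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// CronJob-spawned Jobs are named <cronjob>-<scheduled time in minutes since epoch>.
	const suffix = 29329410
	fmt.Println(time.Unix(suffix*60, 0).UTC()) // 2025-10-06 15:30:00 +0000 UTC
}
```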
Oct 06 15:30:00 crc kubenswrapper[4763]: E1006 15:30:00.171410 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="extract-content"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.171430 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="extract-content"
Oct 06 15:30:00 crc kubenswrapper[4763]: E1006 15:30:00.171457 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="extract-utilities"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.171468 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="extract-utilities"
Oct 06 15:30:00 crc kubenswrapper[4763]: E1006 15:30:00.171497 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="registry-server"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.171506 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="registry-server"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.171726 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb126528-c878-431b-9653-2ec09fd9ae81" containerName="registry-server"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.172378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.175006 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.175403 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.186157 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"]
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.243195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxbrs\" (UniqueName: \"kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.243269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.243296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.345475 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxbrs\" (UniqueName: \"kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.345576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.345670 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.347275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.352267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.360916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxbrs\" (UniqueName: \"kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs\") pod \"collect-profiles-29329410-v45wc\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.495789 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:00 crc kubenswrapper[4763]: I1006 15:30:00.931656 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"]
Oct 06 15:30:01 crc kubenswrapper[4763]: I1006 15:30:01.366282 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c609b5b-2ad2-4145-a3e9-1fedbde830d8" containerID="614d09c275e5c70b5c2676c37afc41016169eb9769d22d49017bcd07ed1fe108" exitCode=0
Oct 06 15:30:01 crc kubenswrapper[4763]: I1006 15:30:01.366329 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc" event={"ID":"9c609b5b-2ad2-4145-a3e9-1fedbde830d8","Type":"ContainerDied","Data":"614d09c275e5c70b5c2676c37afc41016169eb9769d22d49017bcd07ed1fe108"}
Oct 06 15:30:01 crc kubenswrapper[4763]: I1006 15:30:01.366689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc" event={"ID":"9c609b5b-2ad2-4145-a3e9-1fedbde830d8","Type":"ContainerStarted","Data":"1a362272ce94462c8810fbed71b2eca1513321a79e326356fae75b33be40a8c2"}
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.634751 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.677214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume\") pod \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") "
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.677279 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxbrs\" (UniqueName: \"kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs\") pod \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") "
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.677437 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume\") pod \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\" (UID: \"9c609b5b-2ad2-4145-a3e9-1fedbde830d8\") "
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.679304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c609b5b-2ad2-4145-a3e9-1fedbde830d8" (UID: "9c609b5b-2ad2-4145-a3e9-1fedbde830d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.684092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs" (OuterVolumeSpecName: "kube-api-access-qxbrs") pod "9c609b5b-2ad2-4145-a3e9-1fedbde830d8" (UID: "9c609b5b-2ad2-4145-a3e9-1fedbde830d8"). InnerVolumeSpecName "kube-api-access-qxbrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.688764 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9c609b5b-2ad2-4145-a3e9-1fedbde830d8" (UID: "9c609b5b-2ad2-4145-a3e9-1fedbde830d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.778560 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.778602 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxbrs\" (UniqueName: \"kubernetes.io/projected/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-kube-api-access-qxbrs\") on node \"crc\" DevicePath \"\""
Oct 06 15:30:02 crc kubenswrapper[4763]: I1006 15:30:02.778626 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c609b5b-2ad2-4145-a3e9-1fedbde830d8-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.381818 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc" event={"ID":"9c609b5b-2ad2-4145-a3e9-1fedbde830d8","Type":"ContainerDied","Data":"1a362272ce94462c8810fbed71b2eca1513321a79e326356fae75b33be40a8c2"}
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.381865 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a362272ce94462c8810fbed71b2eca1513321a79e326356fae75b33be40a8c2"
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.381874 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.711991 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"]
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.717179 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-nptf5"]
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.876799 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.876853 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.876893 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw"
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.877453 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 15:30:03 crc kubenswrapper[4763]: I1006 15:30:03.877500 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4" gracePeriod=600
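After repeated liveness failures against the daemon's /health endpoint on 127.0.0.1:8798 (three consecutive refusals here, consistent with the default failureThreshold of 3), the kubelet kills the container with the pod's termination grace period (600s above) and restarts it. The httpGet probe itself only wants a 2xx response; a minimal sketch of such an endpoint, purely illustrative and not the machine-config-daemon's actual handler:

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	// Any 2xx response satisfies an httpGet liveness probe; the kubelet
	// treats "connection refused", as in the entries above, as a failure.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", nil))
}
```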
Oct 06 15:30:04 crc kubenswrapper[4763]: I1006 15:30:04.392529 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4" exitCode=0
Oct 06 15:30:04 crc kubenswrapper[4763]: I1006 15:30:04.392604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4"}
Oct 06 15:30:04 crc kubenswrapper[4763]: I1006 15:30:04.393276 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf"}
Oct 06 15:30:04 crc kubenswrapper[4763]: I1006 15:30:04.393331 4763 scope.go:117] "RemoveContainer" containerID="b4272000643684bff0ee7ed2c01bdafd32e3ec384a72827532dd2a240b160006"
Oct 06 15:30:05 crc kubenswrapper[4763]: I1006 15:30:05.591277 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b243bd7-367c-41e7-9101-981ed6d10a13" path="/var/lib/kubelet/pods/6b243bd7-367c-41e7-9101-981ed6d10a13/volumes"
Oct 06 15:30:27 crc kubenswrapper[4763]: I1006 15:30:27.756845 4763 scope.go:117] "RemoveContainer" containerID="f8e75e84c9eadf077b260ee87240d74c4238eb9a17823964c806038b34446ae8"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.756354 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:31:58 crc kubenswrapper[4763]: E1006 15:31:58.757504 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c609b5b-2ad2-4145-a3e9-1fedbde830d8" containerName="collect-profiles"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.757524 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c609b5b-2ad2-4145-a3e9-1fedbde830d8" containerName="collect-profiles"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.757804 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c609b5b-2ad2-4145-a3e9-1fedbde830d8" containerName="collect-profiles"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.759560 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.766120 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.867997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.868317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmrv\" (UniqueName: \"kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.868502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.970040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.970089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmrv\" (UniqueName: \"kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.970122 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.970657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.970706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:58 crc kubenswrapper[4763]: I1006 15:31:58.993558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmrv\" (UniqueName: \"kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv\") pod \"redhat-marketplace-m255t\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") " pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:59 crc kubenswrapper[4763]: I1006 15:31:59.081485 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:31:59 crc kubenswrapper[4763]: I1006 15:31:59.323653 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:31:59 crc kubenswrapper[4763]: I1006 15:31:59.399764 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerStarted","Data":"5dc0b3b99b3cbaef28f323d70a6ffd641d5f08898d8ee044ecf30c3b6777f032"}
Oct 06 15:32:00 crc kubenswrapper[4763]: I1006 15:32:00.407823 4763 generic.go:334] "Generic (PLEG): container finished" podID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerID="63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c" exitCode=0
Oct 06 15:32:00 crc kubenswrapper[4763]: I1006 15:32:00.407879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerDied","Data":"63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c"}
Oct 06 15:32:02 crc kubenswrapper[4763]: I1006 15:32:02.428092 4763 generic.go:334] "Generic (PLEG): container finished" podID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerID="7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1" exitCode=0
Oct 06 15:32:02 crc kubenswrapper[4763]: I1006 15:32:02.428147 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerDied","Data":"7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1"}
Oct 06 15:32:03 crc kubenswrapper[4763]: I1006 15:32:03.439698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerStarted","Data":"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"}
Oct 06 15:32:03 crc kubenswrapper[4763]: I1006 15:32:03.459579 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m255t" podStartSLOduration=2.746652574 podStartE2EDuration="5.45955104s" podCreationTimestamp="2025-10-06 15:31:58 +0000 UTC" firstStartedPulling="2025-10-06 15:32:00.4098856 +0000 UTC m=+2317.565178112" lastFinishedPulling="2025-10-06 15:32:03.122784066 +0000 UTC m=+2320.278076578" observedRunningTime="2025-10-06 15:32:03.456132296 +0000 UTC m=+2320.611424808" watchObservedRunningTime="2025-10-06 15:32:03.45955104 +0000 UTC m=+2320.614843552"
Oct 06 15:32:09 crc kubenswrapper[4763]: I1006 15:32:09.082807 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:09 crc kubenswrapper[4763]: I1006 15:32:09.083517 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:09 crc kubenswrapper[4763]: I1006 15:32:09.160671 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:09 crc kubenswrapper[4763]: I1006 15:32:09.570700 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:09 crc kubenswrapper[4763]: I1006 15:32:09.630948 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:32:11 crc kubenswrapper[4763]: I1006 15:32:11.508815 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m255t" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="registry-server" containerID="cri-o://e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e" gracePeriod=2
Oct 06 15:32:11 crc kubenswrapper[4763]: I1006 15:32:11.956333 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.157770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkmrv\" (UniqueName: \"kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv\") pod \"a223def1-b2e7-4a9f-b988-e11b767320e8\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") "
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.157816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content\") pod \"a223def1-b2e7-4a9f-b988-e11b767320e8\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") "
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.157982 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities\") pod \"a223def1-b2e7-4a9f-b988-e11b767320e8\" (UID: \"a223def1-b2e7-4a9f-b988-e11b767320e8\") "
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.161203 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities" (OuterVolumeSpecName: "utilities") pod "a223def1-b2e7-4a9f-b988-e11b767320e8" (UID: "a223def1-b2e7-4a9f-b988-e11b767320e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.162743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv" (OuterVolumeSpecName: "kube-api-access-qkmrv") pod "a223def1-b2e7-4a9f-b988-e11b767320e8" (UID: "a223def1-b2e7-4a9f-b988-e11b767320e8"). InnerVolumeSpecName "kube-api-access-qkmrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.170743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a223def1-b2e7-4a9f-b988-e11b767320e8" (UID: "a223def1-b2e7-4a9f-b988-e11b767320e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.260485 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.260796 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkmrv\" (UniqueName: \"kubernetes.io/projected/a223def1-b2e7-4a9f-b988-e11b767320e8-kube-api-access-qkmrv\") on node \"crc\" DevicePath \"\""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.260810 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a223def1-b2e7-4a9f-b988-e11b767320e8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.519030 4763 generic.go:334] "Generic (PLEG): container finished" podID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerID="e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e" exitCode=0
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.519096 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m255t"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.519108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerDied","Data":"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"}
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.520683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m255t" event={"ID":"a223def1-b2e7-4a9f-b988-e11b767320e8","Type":"ContainerDied","Data":"5dc0b3b99b3cbaef28f323d70a6ffd641d5f08898d8ee044ecf30c3b6777f032"}
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.520712 4763 scope.go:117] "RemoveContainer" containerID="e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.544600 4763 scope.go:117] "RemoveContainer" containerID="7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.566186 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.566856 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m255t"]
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.585181 4763 scope.go:117] "RemoveContainer" containerID="63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.607394 4763 scope.go:117] "RemoveContainer" containerID="e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"
Oct 06 15:32:12 crc kubenswrapper[4763]: E1006 15:32:12.607874 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e\": container with ID starting with e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e not found: ID does not exist" containerID="e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.607906 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e"} err="failed to get container status \"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e\": rpc error: code = NotFound desc = could not find container \"e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e\": container with ID starting with e3aa23e65806b98b492fafdfb60055a45c76453be3e89b1388dc44e8e5f6fc5e not found: ID does not exist"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.607929 4763 scope.go:117] "RemoveContainer" containerID="7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1"
Oct 06 15:32:12 crc kubenswrapper[4763]: E1006 15:32:12.608155 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1\": container with ID starting with 7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1 not found: ID does not exist" containerID="7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.608184 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1"} err="failed to get container status \"7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1\": rpc error: code = NotFound desc = could not find container \"7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1\": container with ID starting with 7044bd805b95f87ddc5f80ad934f81e9204e97ca63bd562f2e3b9ed1c30b47b1 not found: ID does not exist"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.608200 4763 scope.go:117] "RemoveContainer" containerID="63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c"
Oct 06 15:32:12 crc kubenswrapper[4763]: E1006 15:32:12.608529 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c\": container with ID starting with 63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c not found: ID does not exist" containerID="63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c"
Oct 06 15:32:12 crc kubenswrapper[4763]: I1006 15:32:12.608559 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c"} err="failed to get container status \"63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c\": rpc error: code = NotFound desc = could not find container \"63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c\": container with ID starting with 63286230dad51a67dad7b4cf16509bdca164b6776f952aa5c21d5b2e6dbd4c2c not found: ID does not exist"
Oct 06 15:32:13 crc kubenswrapper[4763]: I1006 15:32:13.583263 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" path="/var/lib/kubelet/pods/a223def1-b2e7-4a9f-b988-e11b767320e8/volumes"
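Every marketplace catalog pod in this stream follows the same arc: extract-utilities and extract-content exit 0, registry-server starts and passes its startup and readiness probes, and within seconds the pod is deleted and its volumes reaped. Interleaved entries like these are easier to audit one pod at a time; a small filter, assuming the journal output is piped in on stdin (the file name filter.go and the pod-name argument are illustrative):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	// Usage: journalctl -u kubelet | go run filter.go redhat-marketplace-m255t
	// Prints only the entries mentioning the given pod name.
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: filter <pod-name>")
		os.Exit(1)
	}
	pod := os.Args[1]
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet entries can be long
	for sc.Scan() {
		if strings.Contains(sc.Text(), pod) {
			fmt.Println(sc.Text())
		}
	}
}
```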
Oct 06 15:32:33 crc kubenswrapper[4763]: I1006 15:32:33.876969 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:32:33 crc kubenswrapper[4763]: I1006 15:32:33.877721 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:33:03 crc kubenswrapper[4763]: I1006 15:33:03.876340 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 15:33:03 crc kubenswrapper[4763]: I1006 15:33:03.877039 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.305717 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"]
Oct 06 15:33:05 crc kubenswrapper[4763]: E1006 15:33:05.306081 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="extract-utilities"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.306094 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="extract-utilities"
Oct 06 15:33:05 crc kubenswrapper[4763]: E1006 15:33:05.306103 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="registry-server"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.306109 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="registry-server"
Oct 06 15:33:05 crc kubenswrapper[4763]: E1006 15:33:05.306126 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="extract-content"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.306132 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="extract-content"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.306307 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a223def1-b2e7-4a9f-b988-e11b767320e8" containerName="registry-server"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.307302 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.311651 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"]
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.405477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.405579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.405647 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.507085 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.507152 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.507236 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.507572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.507709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.538376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz"
\"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") pod \"redhat-operators-2lggz\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:05 crc kubenswrapper[4763]: I1006 15:33:05.632342 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:06 crc kubenswrapper[4763]: I1006 15:33:06.050217 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"] Oct 06 15:33:07 crc kubenswrapper[4763]: I1006 15:33:07.031294 4763 generic.go:334] "Generic (PLEG): container finished" podID="759049a9-3f17-431b-9834-8afe3e78a69d" containerID="e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389" exitCode=0 Oct 06 15:33:07 crc kubenswrapper[4763]: I1006 15:33:07.031353 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerDied","Data":"e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389"} Oct 06 15:33:07 crc kubenswrapper[4763]: I1006 15:33:07.031385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerStarted","Data":"0c6ee2a672631ae97f50a882547df71f4c2929424291d24d8d06b973944dbf31"} Oct 06 15:33:07 crc kubenswrapper[4763]: I1006 15:33:07.035006 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:33:08 crc kubenswrapper[4763]: I1006 15:33:08.039524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerStarted","Data":"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde"} Oct 06 15:33:09 crc kubenswrapper[4763]: I1006 15:33:09.053006 4763 generic.go:334] "Generic (PLEG): container finished" podID="759049a9-3f17-431b-9834-8afe3e78a69d" containerID="052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde" exitCode=0 Oct 06 15:33:09 crc kubenswrapper[4763]: I1006 15:33:09.053100 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerDied","Data":"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde"} Oct 06 15:33:10 crc kubenswrapper[4763]: I1006 15:33:10.063298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerStarted","Data":"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88"} Oct 06 15:33:10 crc kubenswrapper[4763]: I1006 15:33:10.085244 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2lggz" podStartSLOduration=2.644884602 podStartE2EDuration="5.085223452s" podCreationTimestamp="2025-10-06 15:33:05 +0000 UTC" firstStartedPulling="2025-10-06 15:33:07.034811734 +0000 UTC m=+2384.190104246" lastFinishedPulling="2025-10-06 15:33:09.475150594 +0000 UTC m=+2386.630443096" observedRunningTime="2025-10-06 15:33:10.080800247 +0000 UTC m=+2387.236092759" watchObservedRunningTime="2025-10-06 15:33:10.085223452 +0000 UTC m=+2387.240515974" Oct 06 15:33:15 crc 
kubenswrapper[4763]: I1006 15:33:15.633315 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:15 crc kubenswrapper[4763]: I1006 15:33:15.635148 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:15 crc kubenswrapper[4763]: I1006 15:33:15.673074 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:16 crc kubenswrapper[4763]: I1006 15:33:16.151249 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:16 crc kubenswrapper[4763]: I1006 15:33:16.196402 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"] Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.122857 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lggz" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="registry-server" containerID="cri-o://6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88" gracePeriod=2 Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.557004 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.586941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities\") pod \"759049a9-3f17-431b-9834-8afe3e78a69d\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.587051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") pod \"759049a9-3f17-431b-9834-8afe3e78a69d\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.587236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content\") pod \"759049a9-3f17-431b-9834-8afe3e78a69d\" (UID: \"759049a9-3f17-431b-9834-8afe3e78a69d\") " Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.587979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities" (OuterVolumeSpecName: "utilities") pod "759049a9-3f17-431b-9834-8afe3e78a69d" (UID: "759049a9-3f17-431b-9834-8afe3e78a69d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.588202 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.593795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn" (OuterVolumeSpecName: "kube-api-access-jgddn") pod "759049a9-3f17-431b-9834-8afe3e78a69d" (UID: "759049a9-3f17-431b-9834-8afe3e78a69d"). InnerVolumeSpecName "kube-api-access-jgddn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:33:18 crc kubenswrapper[4763]: I1006 15:33:18.689317 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgddn\" (UniqueName: \"kubernetes.io/projected/759049a9-3f17-431b-9834-8afe3e78a69d-kube-api-access-jgddn\") on node \"crc\" DevicePath \"\"" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.136745 4763 generic.go:334] "Generic (PLEG): container finished" podID="759049a9-3f17-431b-9834-8afe3e78a69d" containerID="6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88" exitCode=0 Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.136830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerDied","Data":"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88"} Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.136871 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lggz" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.136908 4763 scope.go:117] "RemoveContainer" containerID="6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.136885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lggz" event={"ID":"759049a9-3f17-431b-9834-8afe3e78a69d","Type":"ContainerDied","Data":"0c6ee2a672631ae97f50a882547df71f4c2929424291d24d8d06b973944dbf31"} Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.170537 4763 scope.go:117] "RemoveContainer" containerID="052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.197694 4763 scope.go:117] "RemoveContainer" containerID="e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.216787 4763 scope.go:117] "RemoveContainer" containerID="6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88" Oct 06 15:33:19 crc kubenswrapper[4763]: E1006 15:33:19.217331 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88\": container with ID starting with 6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88 not found: ID does not exist" containerID="6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.217367 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88"} err="failed to get container status \"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88\": rpc error: code = NotFound desc = could not find container \"6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88\": container with ID starting with 6baf125727f7a92136cae2064e8b977d5ffb0d0a5f1e824d405bd666351c0f88 not found: ID does not exist" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.217392 4763 scope.go:117] "RemoveContainer" containerID="052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde" Oct 06 15:33:19 crc kubenswrapper[4763]: E1006 15:33:19.217769 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde\": container with ID starting with 052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde not found: ID does not exist" containerID="052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.217797 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde"} err="failed to get container status \"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde\": rpc error: code = NotFound desc = could not find container \"052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde\": container with ID starting with 052c7f609b5ad6816267ec532b7f9b5ebe12723c50bc6d422f5d1f6dde999fde not found: ID does not exist" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.217821 4763 scope.go:117] "RemoveContainer" containerID="e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389" Oct 06 15:33:19 crc kubenswrapper[4763]: E1006 15:33:19.218146 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389\": container with ID starting with e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389 not found: ID does not exist" containerID="e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.218170 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389"} err="failed to get container status \"e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389\": rpc error: code = NotFound desc = could not find container \"e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389\": container with ID starting with e6882aed835c102c69a513a624b36337bf4667ed077ac259b5902f28388c6389 not found: ID does not exist" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.586865 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "759049a9-3f17-431b-9834-8afe3e78a69d" (UID: "759049a9-3f17-431b-9834-8afe3e78a69d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.603306 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759049a9-3f17-431b-9834-8afe3e78a69d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.757932 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"] Oct 06 15:33:19 crc kubenswrapper[4763]: I1006 15:33:19.764224 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lggz"] Oct 06 15:33:21 crc kubenswrapper[4763]: I1006 15:33:21.595747 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" path="/var/lib/kubelet/pods/759049a9-3f17-431b-9834-8afe3e78a69d/volumes" Oct 06 15:33:33 crc kubenswrapper[4763]: I1006 15:33:33.876747 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:33:33 crc kubenswrapper[4763]: I1006 15:33:33.877299 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:33:33 crc kubenswrapper[4763]: I1006 15:33:33.877355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:33:33 crc kubenswrapper[4763]: I1006 15:33:33.878205 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:33:33 crc kubenswrapper[4763]: I1006 15:33:33.878267 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" gracePeriod=600 Oct 06 15:33:34 crc kubenswrapper[4763]: E1006 15:33:34.000746 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:33:34 crc kubenswrapper[4763]: I1006 15:33:34.262674 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" exitCode=0 Oct 06 15:33:34 crc kubenswrapper[4763]: I1006 15:33:34.262707 4763 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf"} Oct 06 15:33:34 crc kubenswrapper[4763]: I1006 15:33:34.262771 4763 scope.go:117] "RemoveContainer" containerID="53b1d1c8a18ee3e72bcb955bbfcf19a41aab400299c0d9537c649947bbc5aeb4" Oct 06 15:33:34 crc kubenswrapper[4763]: I1006 15:33:34.263584 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:33:34 crc kubenswrapper[4763]: E1006 15:33:34.264124 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:33:47 crc kubenswrapper[4763]: I1006 15:33:47.575466 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:33:47 crc kubenswrapper[4763]: E1006 15:33:47.576189 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:34:01 crc kubenswrapper[4763]: I1006 15:34:01.574813 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:34:01 crc kubenswrapper[4763]: E1006 15:34:01.575848 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:34:14 crc kubenswrapper[4763]: I1006 15:34:14.575443 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:34:14 crc kubenswrapper[4763]: E1006 15:34:14.576303 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:34:26 crc kubenswrapper[4763]: I1006 15:34:26.575381 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:34:26 crc kubenswrapper[4763]: E1006 15:34:26.576769 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:34:39 crc kubenswrapper[4763]: I1006 15:34:39.574919 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:34:39 crc kubenswrapper[4763]: E1006 15:34:39.576285 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:34:52 crc kubenswrapper[4763]: I1006 15:34:52.574858 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:34:52 crc kubenswrapper[4763]: E1006 15:34:52.575587 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:35:03 crc kubenswrapper[4763]: I1006 15:35:03.583670 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:35:03 crc kubenswrapper[4763]: E1006 15:35:03.584585 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:35:15 crc kubenswrapper[4763]: I1006 15:35:15.575433 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:35:15 crc kubenswrapper[4763]: E1006 15:35:15.576178 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:35:26 crc kubenswrapper[4763]: I1006 15:35:26.575786 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:35:26 crc kubenswrapper[4763]: E1006 15:35:26.576806 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:35:38 crc kubenswrapper[4763]: I1006 15:35:38.575262 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:35:38 crc kubenswrapper[4763]: E1006 15:35:38.576354 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:35:50 crc kubenswrapper[4763]: I1006 15:35:50.574445 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:35:50 crc kubenswrapper[4763]: E1006 15:35:50.575168 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:36:03 crc kubenswrapper[4763]: I1006 15:36:03.578842 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:36:03 crc kubenswrapper[4763]: E1006 15:36:03.579473 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:36:14 crc kubenswrapper[4763]: I1006 15:36:14.575026 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:36:14 crc kubenswrapper[4763]: E1006 15:36:14.576683 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:36:27 crc kubenswrapper[4763]: I1006 15:36:27.575331 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:36:27 crc kubenswrapper[4763]: E1006 15:36:27.577235 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:36:42 crc kubenswrapper[4763]: I1006 15:36:42.574951 4763 scope.go:117] "RemoveContainer" 
containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:36:42 crc kubenswrapper[4763]: E1006 15:36:42.575687 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:36:56 crc kubenswrapper[4763]: I1006 15:36:56.575340 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:36:56 crc kubenswrapper[4763]: E1006 15:36:56.576336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:37:08 crc kubenswrapper[4763]: I1006 15:37:08.575404 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:37:08 crc kubenswrapper[4763]: E1006 15:37:08.576245 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:37:21 crc kubenswrapper[4763]: I1006 15:37:21.575828 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:37:21 crc kubenswrapper[4763]: E1006 15:37:21.578995 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:37:33 crc kubenswrapper[4763]: I1006 15:37:33.586502 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:37:33 crc kubenswrapper[4763]: E1006 15:37:33.587765 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:37:44 crc kubenswrapper[4763]: I1006 15:37:44.575965 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:37:44 crc kubenswrapper[4763]: E1006 15:37:44.577159 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:37:56 crc kubenswrapper[4763]: I1006 15:37:56.575032 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:37:56 crc kubenswrapper[4763]: E1006 15:37:56.575801 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:38:08 crc kubenswrapper[4763]: I1006 15:38:08.575919 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:38:08 crc kubenswrapper[4763]: E1006 15:38:08.577157 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:38:19 crc kubenswrapper[4763]: I1006 15:38:19.577001 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:38:19 crc kubenswrapper[4763]: E1006 15:38:19.578036 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:38:31 crc kubenswrapper[4763]: I1006 15:38:31.574545 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:38:31 crc kubenswrapper[4763]: E1006 15:38:31.575341 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:38:43 crc kubenswrapper[4763]: I1006 15:38:43.583441 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:38:43 crc kubenswrapper[4763]: I1006 15:38:43.867239 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3"} Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.105825 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:19 crc kubenswrapper[4763]: E1006 15:40:19.106856 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="extract-content" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.106873 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="extract-content" Oct 06 15:40:19 crc kubenswrapper[4763]: E1006 15:40:19.106885 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="registry-server" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.106894 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="registry-server" Oct 06 15:40:19 crc kubenswrapper[4763]: E1006 15:40:19.106915 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="extract-utilities" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.106927 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="extract-utilities" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.107150 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="759049a9-3f17-431b-9834-8afe3e78a69d" containerName="registry-server" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.108731 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.125113 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.317102 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.317205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72flz\" (UniqueName: \"kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.317320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.418332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72flz\" (UniqueName: \"kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.418399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.418552 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.419052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.419080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.450389 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-72flz\" (UniqueName: \"kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz\") pod \"certified-operators-vskxn\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.527794 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:19 crc kubenswrapper[4763]: I1006 15:40:19.994006 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:20 crc kubenswrapper[4763]: I1006 15:40:20.691161 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerID="33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb" exitCode=0 Oct 06 15:40:20 crc kubenswrapper[4763]: I1006 15:40:20.691209 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerDied","Data":"33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb"} Oct 06 15:40:20 crc kubenswrapper[4763]: I1006 15:40:20.691236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerStarted","Data":"4307b4e24eeeee378992641bdfa98eb89413651b1af03fb7e7bd6d00c7bd5fa0"} Oct 06 15:40:20 crc kubenswrapper[4763]: I1006 15:40:20.692737 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:40:22 crc kubenswrapper[4763]: I1006 15:40:22.711223 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerID="d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756" exitCode=0 Oct 06 15:40:22 crc kubenswrapper[4763]: I1006 15:40:22.711315 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerDied","Data":"d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756"} Oct 06 15:40:23 crc kubenswrapper[4763]: I1006 15:40:23.722564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerStarted","Data":"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690"} Oct 06 15:40:23 crc kubenswrapper[4763]: I1006 15:40:23.743251 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vskxn" podStartSLOduration=2.155045219 podStartE2EDuration="4.743233488s" podCreationTimestamp="2025-10-06 15:40:19 +0000 UTC" firstStartedPulling="2025-10-06 15:40:20.692497085 +0000 UTC m=+2817.847789597" lastFinishedPulling="2025-10-06 15:40:23.280685334 +0000 UTC m=+2820.435977866" observedRunningTime="2025-10-06 15:40:23.740566076 +0000 UTC m=+2820.895858588" watchObservedRunningTime="2025-10-06 15:40:23.743233488 +0000 UTC m=+2820.898525990" Oct 06 15:40:29 crc kubenswrapper[4763]: I1006 15:40:29.528017 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:29 crc kubenswrapper[4763]: I1006 15:40:29.528751 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:29 crc kubenswrapper[4763]: I1006 15:40:29.592534 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:29 crc kubenswrapper[4763]: I1006 15:40:29.836796 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:29 crc kubenswrapper[4763]: I1006 15:40:29.884743 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:31 crc kubenswrapper[4763]: I1006 15:40:31.791650 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vskxn" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="registry-server" containerID="cri-o://16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690" gracePeriod=2 Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.248327 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.325030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72flz\" (UniqueName: \"kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz\") pod \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.325404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities\") pod \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.325422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content\") pod \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\" (UID: \"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf\") " Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.327022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities" (OuterVolumeSpecName: "utilities") pod "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" (UID: "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.334740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz" (OuterVolumeSpecName: "kube-api-access-72flz") pod "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" (UID: "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf"). InnerVolumeSpecName "kube-api-access-72flz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.387821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" (UID: "1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.427156 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.427196 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.427214 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72flz\" (UniqueName: \"kubernetes.io/projected/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf-kube-api-access-72flz\") on node \"crc\" DevicePath \"\"" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.801972 4763 generic.go:334] "Generic (PLEG): container finished" podID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerID="16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690" exitCode=0 Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.802020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerDied","Data":"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690"} Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.802057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vskxn" event={"ID":"1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf","Type":"ContainerDied","Data":"4307b4e24eeeee378992641bdfa98eb89413651b1af03fb7e7bd6d00c7bd5fa0"} Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.802082 4763 scope.go:117] "RemoveContainer" containerID="16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.802115 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vskxn" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.827922 4763 scope.go:117] "RemoveContainer" containerID="d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.861389 4763 scope.go:117] "RemoveContainer" containerID="33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.867706 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.876576 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vskxn"] Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.900413 4763 scope.go:117] "RemoveContainer" containerID="16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690" Oct 06 15:40:32 crc kubenswrapper[4763]: E1006 15:40:32.901015 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690\": container with ID starting with 16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690 not found: ID does not exist" containerID="16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.901054 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690"} err="failed to get container status \"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690\": rpc error: code = NotFound desc = could not find container \"16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690\": container with ID starting with 16fa90bc7c0b98c4734fdc4c46c98febeae813a25be4a9b75b69b0010bb89690 not found: ID does not exist" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.901079 4763 scope.go:117] "RemoveContainer" containerID="d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756" Oct 06 15:40:32 crc kubenswrapper[4763]: E1006 15:40:32.901441 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756\": container with ID starting with d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756 not found: ID does not exist" containerID="d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.901484 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756"} err="failed to get container status \"d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756\": rpc error: code = NotFound desc = could not find container \"d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756\": container with ID starting with d6679ad8678d8d9bee64a76f4ed82eafc171996e11ac88664fc297ac7fcaf756 not found: ID does not exist" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.901507 4763 scope.go:117] "RemoveContainer" containerID="33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb" Oct 06 15:40:32 crc kubenswrapper[4763]: E1006 15:40:32.901911 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb\": container with ID starting with 33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb not found: ID does not exist" containerID="33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb" Oct 06 15:40:32 crc kubenswrapper[4763]: I1006 15:40:32.901937 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb"} err="failed to get container status \"33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb\": rpc error: code = NotFound desc = could not find container \"33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb\": container with ID starting with 33e0eef671cfd69fd2d533aea1ca131ea757cd13281124b6b3b18cb7710075cb not found: ID does not exist" Oct 06 15:40:33 crc kubenswrapper[4763]: I1006 15:40:33.591823 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" path="/var/lib/kubelet/pods/1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf/volumes" Oct 06 15:41:03 crc kubenswrapper[4763]: I1006 15:41:03.877207 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:41:03 crc kubenswrapper[4763]: I1006 15:41:03.878026 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:41:33 crc kubenswrapper[4763]: I1006 15:41:33.877370 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:41:33 crc kubenswrapper[4763]: I1006 15:41:33.878078 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:42:03 crc kubenswrapper[4763]: I1006 15:42:03.877243 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:42:03 crc kubenswrapper[4763]: I1006 15:42:03.877946 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:42:03 crc kubenswrapper[4763]: I1006 15:42:03.878010 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:42:03 crc kubenswrapper[4763]: I1006 15:42:03.878906 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:42:03 crc kubenswrapper[4763]: I1006 15:42:03.878999 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3" gracePeriod=600 Oct 06 15:42:04 crc kubenswrapper[4763]: I1006 15:42:04.610517 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3" exitCode=0 Oct 06 15:42:04 crc kubenswrapper[4763]: I1006 15:42:04.610606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3"} Oct 06 15:42:04 crc kubenswrapper[4763]: I1006 15:42:04.611023 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"} Oct 06 15:42:04 crc kubenswrapper[4763]: I1006 15:42:04.611047 4763 scope.go:117] "RemoveContainer" containerID="06941331206e0c9193a7cb87fcfe275f94556b4c4beb7dfd4291a9f2363e96bf" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.251016 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:08 crc kubenswrapper[4763]: E1006 15:42:08.251831 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="extract-utilities" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.251846 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="extract-utilities" Oct 06 15:42:08 crc kubenswrapper[4763]: E1006 15:42:08.251878 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="registry-server" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.251886 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="registry-server" Oct 06 15:42:08 crc kubenswrapper[4763]: E1006 15:42:08.251909 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="extract-content" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.251917 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="extract-content" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.252081 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ff5c055-41e5-44ad-a2d6-6ef4dcb78faf" containerName="registry-server" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.254442 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.260107 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.365780 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqq9v\" (UniqueName: \"kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.366039 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.366138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.468218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.468321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.468414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqq9v\" (UniqueName: \"kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.469222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.469265 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities\") pod \"redhat-marketplace-wd5m4\" (UID: 
\"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.490335 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqq9v\" (UniqueName: \"kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v\") pod \"redhat-marketplace-wd5m4\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:08 crc kubenswrapper[4763]: I1006 15:42:08.581278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:09 crc kubenswrapper[4763]: I1006 15:42:09.031304 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:09 crc kubenswrapper[4763]: W1006 15:42:09.040436 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf045265_1d1b_47d6_ba21_2dd69c731160.slice/crio-b32e0447d4876337a3891f25fcb4b6f5ac063ed15d99ca26cfb3281931d35c5a WatchSource:0}: Error finding container b32e0447d4876337a3891f25fcb4b6f5ac063ed15d99ca26cfb3281931d35c5a: Status 404 returned error can't find the container with id b32e0447d4876337a3891f25fcb4b6f5ac063ed15d99ca26cfb3281931d35c5a Oct 06 15:42:09 crc kubenswrapper[4763]: I1006 15:42:09.653868 4763 generic.go:334] "Generic (PLEG): container finished" podID="af045265-1d1b-47d6-ba21-2dd69c731160" containerID="b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7" exitCode=0 Oct 06 15:42:09 crc kubenswrapper[4763]: I1006 15:42:09.653929 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerDied","Data":"b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7"} Oct 06 15:42:09 crc kubenswrapper[4763]: I1006 15:42:09.653990 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerStarted","Data":"b32e0447d4876337a3891f25fcb4b6f5ac063ed15d99ca26cfb3281931d35c5a"} Oct 06 15:42:10 crc kubenswrapper[4763]: I1006 15:42:10.663762 4763 generic.go:334] "Generic (PLEG): container finished" podID="af045265-1d1b-47d6-ba21-2dd69c731160" containerID="914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf" exitCode=0 Oct 06 15:42:10 crc kubenswrapper[4763]: I1006 15:42:10.663815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerDied","Data":"914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf"} Oct 06 15:42:11 crc kubenswrapper[4763]: I1006 15:42:11.673609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerStarted","Data":"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e"} Oct 06 15:42:11 crc kubenswrapper[4763]: I1006 15:42:11.701162 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wd5m4" podStartSLOduration=2.272302282 podStartE2EDuration="3.701134192s" podCreationTimestamp="2025-10-06 15:42:08 +0000 UTC" firstStartedPulling="2025-10-06 
15:42:09.659099521 +0000 UTC m=+2926.814392033" lastFinishedPulling="2025-10-06 15:42:11.087931411 +0000 UTC m=+2928.243223943" observedRunningTime="2025-10-06 15:42:11.695632242 +0000 UTC m=+2928.850924754" watchObservedRunningTime="2025-10-06 15:42:11.701134192 +0000 UTC m=+2928.856426714" Oct 06 15:42:18 crc kubenswrapper[4763]: I1006 15:42:18.582297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:18 crc kubenswrapper[4763]: I1006 15:42:18.582925 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:18 crc kubenswrapper[4763]: I1006 15:42:18.657599 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:18 crc kubenswrapper[4763]: I1006 15:42:18.820889 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:18 crc kubenswrapper[4763]: I1006 15:42:18.895237 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:20 crc kubenswrapper[4763]: I1006 15:42:20.772463 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wd5m4" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="registry-server" containerID="cri-o://2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e" gracePeriod=2 Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.186310 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.361442 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities\") pod \"af045265-1d1b-47d6-ba21-2dd69c731160\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.361603 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqq9v\" (UniqueName: \"kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v\") pod \"af045265-1d1b-47d6-ba21-2dd69c731160\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.361674 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content\") pod \"af045265-1d1b-47d6-ba21-2dd69c731160\" (UID: \"af045265-1d1b-47d6-ba21-2dd69c731160\") " Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.363296 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities" (OuterVolumeSpecName: "utilities") pod "af045265-1d1b-47d6-ba21-2dd69c731160" (UID: "af045265-1d1b-47d6-ba21-2dd69c731160"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.367754 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v" (OuterVolumeSpecName: "kube-api-access-nqq9v") pod "af045265-1d1b-47d6-ba21-2dd69c731160" (UID: "af045265-1d1b-47d6-ba21-2dd69c731160"). InnerVolumeSpecName "kube-api-access-nqq9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.377978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af045265-1d1b-47d6-ba21-2dd69c731160" (UID: "af045265-1d1b-47d6-ba21-2dd69c731160"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.463972 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqq9v\" (UniqueName: \"kubernetes.io/projected/af045265-1d1b-47d6-ba21-2dd69c731160-kube-api-access-nqq9v\") on node \"crc\" DevicePath \"\"" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.464009 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.464018 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af045265-1d1b-47d6-ba21-2dd69c731160-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.780404 4763 generic.go:334] "Generic (PLEG): container finished" podID="af045265-1d1b-47d6-ba21-2dd69c731160" containerID="2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e" exitCode=0 Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.780456 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerDied","Data":"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e"} Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.780484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wd5m4" event={"ID":"af045265-1d1b-47d6-ba21-2dd69c731160","Type":"ContainerDied","Data":"b32e0447d4876337a3891f25fcb4b6f5ac063ed15d99ca26cfb3281931d35c5a"} Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.780505 4763 scope.go:117] "RemoveContainer" containerID="2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.780693 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wd5m4" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.802455 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.804832 4763 scope.go:117] "RemoveContainer" containerID="914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.810865 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wd5m4"] Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.822786 4763 scope.go:117] "RemoveContainer" containerID="b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.863223 4763 scope.go:117] "RemoveContainer" containerID="2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e" Oct 06 15:42:21 crc kubenswrapper[4763]: E1006 15:42:21.863739 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e\": container with ID starting with 2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e not found: ID does not exist" containerID="2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.863793 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e"} err="failed to get container status \"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e\": rpc error: code = NotFound desc = could not find container \"2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e\": container with ID starting with 2999a4f76d56044c38bcd429d7c4fd1d6896dfcd56b76228679d51222436987e not found: ID does not exist" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.863828 4763 scope.go:117] "RemoveContainer" containerID="914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf" Oct 06 15:42:21 crc kubenswrapper[4763]: E1006 15:42:21.864351 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf\": container with ID starting with 914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf not found: ID does not exist" containerID="914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.864446 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf"} err="failed to get container status \"914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf\": rpc error: code = NotFound desc = could not find container \"914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf\": container with ID starting with 914ecda9186539a9346807a9f4d3ab9f6394cd98dd9b09dcf3d7704ab2523fcf not found: ID does not exist" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.864530 4763 scope.go:117] "RemoveContainer" containerID="b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7" Oct 06 15:42:21 crc kubenswrapper[4763]: E1006 15:42:21.865079 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7\": container with ID starting with b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7 not found: ID does not exist" containerID="b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7" Oct 06 15:42:21 crc kubenswrapper[4763]: I1006 15:42:21.865216 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7"} err="failed to get container status \"b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7\": rpc error: code = NotFound desc = could not find container \"b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7\": container with ID starting with b80a3b90f3060ceaa3e15b1758e335870e65aa836123a21a9b1c0dbc89e07da7 not found: ID does not exist" Oct 06 15:42:23 crc kubenswrapper[4763]: I1006 15:42:23.586185 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" path="/var/lib/kubelet/pods/af045265-1d1b-47d6-ba21-2dd69c731160/volumes" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.776858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:01 crc kubenswrapper[4763]: E1006 15:44:01.777890 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="extract-content" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.777912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="extract-content" Oct 06 15:44:01 crc kubenswrapper[4763]: E1006 15:44:01.777941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="extract-utilities" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.777959 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="extract-utilities" Oct 06 15:44:01 crc kubenswrapper[4763]: E1006 15:44:01.777978 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="registry-server" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.777991 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="registry-server" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.778280 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="af045265-1d1b-47d6-ba21-2dd69c731160" containerName="registry-server" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.780185 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.789484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.921571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.921701 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rfw\" (UniqueName: \"kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:01 crc kubenswrapper[4763]: I1006 15:44:01.921764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.023409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rfw\" (UniqueName: \"kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.023465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.023529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.024039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.024117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.041455 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s9rfw\" (UniqueName: \"kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw\") pod \"community-operators-ptxv9\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.112676 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.603001 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:02 crc kubenswrapper[4763]: I1006 15:44:02.681978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerStarted","Data":"ee0d2aa2549c224392d6df7b45dddd035d8932009de2d53c68dcbd084b282ed6"} Oct 06 15:44:03 crc kubenswrapper[4763]: I1006 15:44:03.698439 4763 generic.go:334] "Generic (PLEG): container finished" podID="367c03b3-54d1-4068-9738-e939b98cd630" containerID="5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772" exitCode=0 Oct 06 15:44:03 crc kubenswrapper[4763]: I1006 15:44:03.698502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerDied","Data":"5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772"} Oct 06 15:44:04 crc kubenswrapper[4763]: I1006 15:44:04.707390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerStarted","Data":"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9"} Oct 06 15:44:05 crc kubenswrapper[4763]: I1006 15:44:05.719008 4763 generic.go:334] "Generic (PLEG): container finished" podID="367c03b3-54d1-4068-9738-e939b98cd630" containerID="094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9" exitCode=0 Oct 06 15:44:05 crc kubenswrapper[4763]: I1006 15:44:05.719092 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerDied","Data":"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9"} Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.566001 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.567604 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.595067 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.691287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.691466 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.691529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqnvv\" (UniqueName: \"kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.728038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerStarted","Data":"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc"} Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.758893 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ptxv9" podStartSLOduration=3.349464667 podStartE2EDuration="5.758867557s" podCreationTimestamp="2025-10-06 15:44:01 +0000 UTC" firstStartedPulling="2025-10-06 15:44:03.701270314 +0000 UTC m=+3040.856562866" lastFinishedPulling="2025-10-06 15:44:06.110673204 +0000 UTC m=+3043.265965756" observedRunningTime="2025-10-06 15:44:06.754328554 +0000 UTC m=+3043.909621066" watchObservedRunningTime="2025-10-06 15:44:06.758867557 +0000 UTC m=+3043.914160069" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.793249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.793363 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.793395 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqnvv\" (UniqueName: \"kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") 
" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.793880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.793969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.828790 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqnvv\" (UniqueName: \"kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv\") pod \"redhat-operators-zmls6\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:06 crc kubenswrapper[4763]: I1006 15:44:06.886482 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:07 crc kubenswrapper[4763]: I1006 15:44:07.379159 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:07 crc kubenswrapper[4763]: I1006 15:44:07.738478 4763 generic.go:334] "Generic (PLEG): container finished" podID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerID="60a033108ce9adbef2b428460c98115670202dcee4e455558fb479804457b68f" exitCode=0 Oct 06 15:44:07 crc kubenswrapper[4763]: I1006 15:44:07.738768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerDied","Data":"60a033108ce9adbef2b428460c98115670202dcee4e455558fb479804457b68f"} Oct 06 15:44:07 crc kubenswrapper[4763]: I1006 15:44:07.739104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerStarted","Data":"3b587fc6f72ce3f4145c0c305297aa2d4123f16e2522ad3e81e843f16420588a"} Oct 06 15:44:08 crc kubenswrapper[4763]: I1006 15:44:08.748033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerStarted","Data":"e777021f739c48bab59a70b2786cddbd282511f34a0ecf95a2ada8f9ed1ced56"} Oct 06 15:44:09 crc kubenswrapper[4763]: I1006 15:44:09.765193 4763 generic.go:334] "Generic (PLEG): container finished" podID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerID="e777021f739c48bab59a70b2786cddbd282511f34a0ecf95a2ada8f9ed1ced56" exitCode=0 Oct 06 15:44:09 crc kubenswrapper[4763]: I1006 15:44:09.765347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerDied","Data":"e777021f739c48bab59a70b2786cddbd282511f34a0ecf95a2ada8f9ed1ced56"} Oct 06 15:44:10 crc kubenswrapper[4763]: I1006 15:44:10.774374 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" 
event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerStarted","Data":"03e82a326d256e7c204536980ed1f5093beefbc4abf9ed01fc5934da4065d56b"} Oct 06 15:44:10 crc kubenswrapper[4763]: I1006 15:44:10.796327 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmls6" podStartSLOduration=2.253221201 podStartE2EDuration="4.79630934s" podCreationTimestamp="2025-10-06 15:44:06 +0000 UTC" firstStartedPulling="2025-10-06 15:44:07.740354052 +0000 UTC m=+3044.895646564" lastFinishedPulling="2025-10-06 15:44:10.283442151 +0000 UTC m=+3047.438734703" observedRunningTime="2025-10-06 15:44:10.794516572 +0000 UTC m=+3047.949809084" watchObservedRunningTime="2025-10-06 15:44:10.79630934 +0000 UTC m=+3047.951601852" Oct 06 15:44:12 crc kubenswrapper[4763]: I1006 15:44:12.113667 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:12 crc kubenswrapper[4763]: I1006 15:44:12.114043 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:12 crc kubenswrapper[4763]: I1006 15:44:12.161940 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:12 crc kubenswrapper[4763]: I1006 15:44:12.829234 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.160491 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.161031 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ptxv9" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="registry-server" containerID="cri-o://f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc" gracePeriod=2 Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.514421 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.646238 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities\") pod \"367c03b3-54d1-4068-9738-e939b98cd630\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.646418 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content\") pod \"367c03b3-54d1-4068-9738-e939b98cd630\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.646455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9rfw\" (UniqueName: \"kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw\") pod \"367c03b3-54d1-4068-9738-e939b98cd630\" (UID: \"367c03b3-54d1-4068-9738-e939b98cd630\") " Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.647168 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities" (OuterVolumeSpecName: "utilities") pod "367c03b3-54d1-4068-9738-e939b98cd630" (UID: "367c03b3-54d1-4068-9738-e939b98cd630"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.655520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw" (OuterVolumeSpecName: "kube-api-access-s9rfw") pod "367c03b3-54d1-4068-9738-e939b98cd630" (UID: "367c03b3-54d1-4068-9738-e939b98cd630"). InnerVolumeSpecName "kube-api-access-s9rfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.692695 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "367c03b3-54d1-4068-9738-e939b98cd630" (UID: "367c03b3-54d1-4068-9738-e939b98cd630"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.748724 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.748765 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9rfw\" (UniqueName: \"kubernetes.io/projected/367c03b3-54d1-4068-9738-e939b98cd630-kube-api-access-s9rfw\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.748779 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c03b3-54d1-4068-9738-e939b98cd630-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.829946 4763 generic.go:334] "Generic (PLEG): container finished" podID="367c03b3-54d1-4068-9738-e939b98cd630" containerID="f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc" exitCode=0 Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.830010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerDied","Data":"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc"} Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.830051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptxv9" event={"ID":"367c03b3-54d1-4068-9738-e939b98cd630","Type":"ContainerDied","Data":"ee0d2aa2549c224392d6df7b45dddd035d8932009de2d53c68dcbd084b282ed6"} Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.830066 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptxv9" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.830078 4763 scope.go:117] "RemoveContainer" containerID="f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.861514 4763 scope.go:117] "RemoveContainer" containerID="094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.862887 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.867385 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ptxv9"] Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.880965 4763 scope.go:117] "RemoveContainer" containerID="5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.886602 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.886691 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.900499 4763 scope.go:117] "RemoveContainer" containerID="f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc" Oct 06 15:44:16 crc kubenswrapper[4763]: E1006 15:44:16.900849 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc\": container with ID starting with f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc not found: ID does not exist" containerID="f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.900877 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc"} err="failed to get container status \"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc\": rpc error: code = NotFound desc = could not find container \"f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc\": container with ID starting with f2c0d043d2b88e1a308a727c021112e3aaf08ec3c97e38bad6c7f0d6c8bdbcfc not found: ID does not exist" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.900895 4763 scope.go:117] "RemoveContainer" containerID="094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9" Oct 06 15:44:16 crc kubenswrapper[4763]: E1006 15:44:16.901108 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9\": container with ID starting with 094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9 not found: ID does not exist" containerID="094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.901125 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9"} err="failed to get container status \"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9\": 
rpc error: code = NotFound desc = could not find container \"094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9\": container with ID starting with 094fcb43856a2cb4fa83f96f6e9154ce7df8968a69342760b1a485ee601cf2f9 not found: ID does not exist" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.901138 4763 scope.go:117] "RemoveContainer" containerID="5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772" Oct 06 15:44:16 crc kubenswrapper[4763]: E1006 15:44:16.901436 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772\": container with ID starting with 5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772 not found: ID does not exist" containerID="5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.901546 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772"} err="failed to get container status \"5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772\": rpc error: code = NotFound desc = could not find container \"5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772\": container with ID starting with 5e42d043b24cdccfacbe2261ba9fca2d0dfd3243ea85bce4eb30afa6df95d772 not found: ID does not exist" Oct 06 15:44:16 crc kubenswrapper[4763]: I1006 15:44:16.928266 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:17 crc kubenswrapper[4763]: I1006 15:44:17.585230 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367c03b3-54d1-4068-9738-e939b98cd630" path="/var/lib/kubelet/pods/367c03b3-54d1-4068-9738-e939b98cd630/volumes" Oct 06 15:44:17 crc kubenswrapper[4763]: I1006 15:44:17.878246 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:19 crc kubenswrapper[4763]: I1006 15:44:19.160115 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:19 crc kubenswrapper[4763]: I1006 15:44:19.852781 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmls6" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="registry-server" containerID="cri-o://03e82a326d256e7c204536980ed1f5093beefbc4abf9ed01fc5934da4065d56b" gracePeriod=2 Oct 06 15:44:20 crc kubenswrapper[4763]: I1006 15:44:20.868910 4763 generic.go:334] "Generic (PLEG): container finished" podID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerID="03e82a326d256e7c204536980ed1f5093beefbc4abf9ed01fc5934da4065d56b" exitCode=0 Oct 06 15:44:20 crc kubenswrapper[4763]: I1006 15:44:20.869223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerDied","Data":"03e82a326d256e7c204536980ed1f5093beefbc4abf9ed01fc5934da4065d56b"} Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.019680 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.212509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqnvv\" (UniqueName: \"kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv\") pod \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.212594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content\") pod \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.212648 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities\") pod \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\" (UID: \"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d\") " Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.213819 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities" (OuterVolumeSpecName: "utilities") pod "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" (UID: "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.220812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv" (OuterVolumeSpecName: "kube-api-access-dqnvv") pod "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" (UID: "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d"). InnerVolumeSpecName "kube-api-access-dqnvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.303077 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" (UID: "7841a57d-91a0-4fc1-9fbe-88b1ef12b78d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.314848 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqnvv\" (UniqueName: \"kubernetes.io/projected/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-kube-api-access-dqnvv\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.314874 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.314885 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.885394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmls6" event={"ID":"7841a57d-91a0-4fc1-9fbe-88b1ef12b78d","Type":"ContainerDied","Data":"3b587fc6f72ce3f4145c0c305297aa2d4123f16e2522ad3e81e843f16420588a"} Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.885497 4763 scope.go:117] "RemoveContainer" containerID="03e82a326d256e7c204536980ed1f5093beefbc4abf9ed01fc5934da4065d56b" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.885520 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmls6" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.914788 4763 scope.go:117] "RemoveContainer" containerID="e777021f739c48bab59a70b2786cddbd282511f34a0ecf95a2ada8f9ed1ced56" Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.915137 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.923583 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmls6"] Oct 06 15:44:21 crc kubenswrapper[4763]: I1006 15:44:21.940703 4763 scope.go:117] "RemoveContainer" containerID="60a033108ce9adbef2b428460c98115670202dcee4e455558fb479804457b68f" Oct 06 15:44:23 crc kubenswrapper[4763]: I1006 15:44:23.585544 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" path="/var/lib/kubelet/pods/7841a57d-91a0-4fc1-9fbe-88b1ef12b78d/volumes" Oct 06 15:44:33 crc kubenswrapper[4763]: I1006 15:44:33.877010 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:44:33 crc kubenswrapper[4763]: I1006 15:44:33.878010 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.214337 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl"] Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215309 4763 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215325 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215339 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215344 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215353 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="extract-content" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215359 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="extract-content" Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215370 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="extract-utilities" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215376 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="extract-utilities" Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215391 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="extract-content" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215397 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="extract-content" Oct 06 15:45:00 crc kubenswrapper[4763]: E1006 15:45:00.215410 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="extract-utilities" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215416 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="extract-utilities" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215534 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="367c03b3-54d1-4068-9738-e939b98cd630" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.215551 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7841a57d-91a0-4fc1-9fbe-88b1ef12b78d" containerName="registry-server" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.216093 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.218272 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.218325 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.219905 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl"] Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.334690 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.334766 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmv2\" (UniqueName: \"kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.335121 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.437197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.437315 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmv2\" (UniqueName: \"kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.437378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.439105 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume\") pod 
\"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.444600 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.455091 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmv2\" (UniqueName: \"kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2\") pod \"collect-profiles-29329425-snntl\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.538857 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:00 crc kubenswrapper[4763]: I1006 15:45:00.959451 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl"] Oct 06 15:45:01 crc kubenswrapper[4763]: I1006 15:45:01.241410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" event={"ID":"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1","Type":"ContainerStarted","Data":"79fce7244c5f011bdb4c39676f0c42d9d25cf68d150a30d747df6292b4de1e6e"} Oct 06 15:45:01 crc kubenswrapper[4763]: I1006 15:45:01.241461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" event={"ID":"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1","Type":"ContainerStarted","Data":"4609eeba624757912a05bea6f5a4bf9a64734e0726565ac63fb1119f3fa79937"} Oct 06 15:45:01 crc kubenswrapper[4763]: I1006 15:45:01.263073 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" podStartSLOduration=1.263054602 podStartE2EDuration="1.263054602s" podCreationTimestamp="2025-10-06 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 15:45:01.257972154 +0000 UTC m=+3098.413264696" watchObservedRunningTime="2025-10-06 15:45:01.263054602 +0000 UTC m=+3098.418347114" Oct 06 15:45:02 crc kubenswrapper[4763]: I1006 15:45:02.251534 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" containerID="79fce7244c5f011bdb4c39676f0c42d9d25cf68d150a30d747df6292b4de1e6e" exitCode=0 Oct 06 15:45:02 crc kubenswrapper[4763]: I1006 15:45:02.251720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" event={"ID":"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1","Type":"ContainerDied","Data":"79fce7244c5f011bdb4c39676f0c42d9d25cf68d150a30d747df6292b4de1e6e"} Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.514794 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.684928 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmv2\" (UniqueName: \"kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2\") pod \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.685016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume\") pod \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.685095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume\") pod \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\" (UID: \"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1\") " Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.686037 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" (UID: "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.695833 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2" (OuterVolumeSpecName: "kube-api-access-glmv2") pod "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" (UID: "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1"). InnerVolumeSpecName "kube-api-access-glmv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.695896 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" (UID: "6d57bdf3-3c07-45d6-90bd-ccf26b207bf1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.786847 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmv2\" (UniqueName: \"kubernetes.io/projected/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-kube-api-access-glmv2\") on node \"crc\" DevicePath \"\"" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.786892 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.786904 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.877023 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:45:03 crc kubenswrapper[4763]: I1006 15:45:03.877091 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:45:04 crc kubenswrapper[4763]: I1006 15:45:04.278222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" event={"ID":"6d57bdf3-3c07-45d6-90bd-ccf26b207bf1","Type":"ContainerDied","Data":"4609eeba624757912a05bea6f5a4bf9a64734e0726565ac63fb1119f3fa79937"} Oct 06 15:45:04 crc kubenswrapper[4763]: I1006 15:45:04.278281 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4609eeba624757912a05bea6f5a4bf9a64734e0726565ac63fb1119f3fa79937" Oct 06 15:45:04 crc kubenswrapper[4763]: I1006 15:45:04.278344 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl" Oct 06 15:45:04 crc kubenswrapper[4763]: I1006 15:45:04.345489 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"] Oct 06 15:45:04 crc kubenswrapper[4763]: I1006 15:45:04.351774 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329380-hpxqc"] Oct 06 15:45:05 crc kubenswrapper[4763]: I1006 15:45:05.584030 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe" path="/var/lib/kubelet/pods/bc156c52-3ff5-4ac5-bfeb-e92bc26bebfe/volumes" Oct 06 15:45:28 crc kubenswrapper[4763]: I1006 15:45:28.139249 4763 scope.go:117] "RemoveContainer" containerID="0ee34648dbf050b997ea9e29f52627bfdcf9dfb106a1a2ac300080c1dc6bf4f5" Oct 06 15:45:33 crc kubenswrapper[4763]: I1006 15:45:33.877003 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:45:33 crc kubenswrapper[4763]: I1006 15:45:33.877525 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:45:33 crc kubenswrapper[4763]: I1006 15:45:33.877600 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:45:33 crc kubenswrapper[4763]: I1006 15:45:33.878823 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:45:33 crc kubenswrapper[4763]: I1006 15:45:33.878954 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" gracePeriod=600 Oct 06 15:45:34 crc kubenswrapper[4763]: E1006 15:45:34.011628 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:45:34 crc kubenswrapper[4763]: I1006 15:45:34.534723 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" exitCode=0 Oct 06 15:45:34 crc kubenswrapper[4763]: I1006 15:45:34.534832 4763 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"} Oct 06 15:45:34 crc kubenswrapper[4763]: I1006 15:45:34.535182 4763 scope.go:117] "RemoveContainer" containerID="c31a41ce68728830fa7359c54d15315e66f49d457c4e9bc1cdb4e6db0c9ba3c3" Oct 06 15:45:34 crc kubenswrapper[4763]: I1006 15:45:34.536000 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:45:34 crc kubenswrapper[4763]: E1006 15:45:34.536386 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:45:48 crc kubenswrapper[4763]: I1006 15:45:48.574485 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:45:48 crc kubenswrapper[4763]: E1006 15:45:48.575710 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:46:00 crc kubenswrapper[4763]: I1006 15:46:00.574792 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:46:00 crc kubenswrapper[4763]: E1006 15:46:00.575523 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:46:14 crc kubenswrapper[4763]: I1006 15:46:14.574801 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:46:14 crc kubenswrapper[4763]: E1006 15:46:14.575596 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:46:29 crc kubenswrapper[4763]: I1006 15:46:29.575964 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:46:29 crc kubenswrapper[4763]: E1006 15:46:29.577002 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 06 15:46:43 crc kubenswrapper[4763]: I1006 15:46:43.581202 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"
Oct 06 15:46:43 crc kubenswrapper[4763]: E1006 15:46:43.581943 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 15:46:54 crc kubenswrapper[4763]: I1006 15:46:54.575892 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"
Oct 06 15:46:54 crc kubenswrapper[4763]: E1006 15:46:54.577197 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 15:47:09 crc kubenswrapper[4763]: I1006 15:47:09.575010 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"
Oct 06 15:47:09 crc kubenswrapper[4763]: E1006 15:47:09.576224 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 15:47:21 crc kubenswrapper[4763]: I1006 15:47:21.575131 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"
Oct 06 15:47:21 crc kubenswrapper[4763]: E1006 15:47:21.575704 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 15:47:32 crc kubenswrapper[4763]: I1006 15:47:32.575139 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2"
Oct 06 15:47:32 crc kubenswrapper[4763]: E1006 15:47:32.575928 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:47:46 crc kubenswrapper[4763]: I1006 15:47:46.575710 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:47:46 crc kubenswrapper[4763]: E1006 15:47:46.576417 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:48:00 crc kubenswrapper[4763]: I1006 15:48:00.575152 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:48:00 crc kubenswrapper[4763]: E1006 15:48:00.576161 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:48:11 crc kubenswrapper[4763]: I1006 15:48:11.575426 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:48:11 crc kubenswrapper[4763]: E1006 15:48:11.576833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:48:24 crc kubenswrapper[4763]: I1006 15:48:24.574886 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:48:24 crc kubenswrapper[4763]: E1006 15:48:24.575894 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:48:39 crc kubenswrapper[4763]: I1006 15:48:39.575829 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:48:39 crc kubenswrapper[4763]: E1006 15:48:39.576725 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:48:54 crc kubenswrapper[4763]: I1006 15:48:54.576180 4763 scope.go:117] "RemoveContainer" 
containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:48:54 crc kubenswrapper[4763]: E1006 15:48:54.577006 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:49:06 crc kubenswrapper[4763]: I1006 15:49:06.574256 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:49:06 crc kubenswrapper[4763]: E1006 15:49:06.574913 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:49:21 crc kubenswrapper[4763]: I1006 15:49:21.575473 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:49:21 crc kubenswrapper[4763]: E1006 15:49:21.576503 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:49:32 crc kubenswrapper[4763]: I1006 15:49:32.575471 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:49:32 crc kubenswrapper[4763]: E1006 15:49:32.576210 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:49:46 crc kubenswrapper[4763]: I1006 15:49:46.575908 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:49:46 crc kubenswrapper[4763]: E1006 15:49:46.576778 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:49:59 crc kubenswrapper[4763]: I1006 15:49:59.575157 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:49:59 crc kubenswrapper[4763]: E1006 15:49:59.575898 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:50:12 crc kubenswrapper[4763]: I1006 15:50:12.575066 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:50:12 crc kubenswrapper[4763]: E1006 15:50:12.575719 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:50:25 crc kubenswrapper[4763]: I1006 15:50:25.575907 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:50:25 crc kubenswrapper[4763]: E1006 15:50:25.576696 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:50:40 crc kubenswrapper[4763]: I1006 15:50:40.575580 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:50:41 crc kubenswrapper[4763]: I1006 15:50:41.127278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490"} Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.455623 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hghkt"] Oct 06 15:51:14 crc kubenswrapper[4763]: E1006 15:51:14.456483 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" containerName="collect-profiles" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.456497 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" containerName="collect-profiles" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.456706 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" containerName="collect-profiles" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.458093 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.481400 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hghkt"] Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.626943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b96tq\" (UniqueName: \"kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.626989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.627058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.728470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.728742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b96tq\" (UniqueName: \"kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.728761 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.729225 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.729228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.755701 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b96tq\" (UniqueName: \"kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq\") pod \"certified-operators-hghkt\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") " pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:14 crc kubenswrapper[4763]: I1006 15:51:14.781239 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:15 crc kubenswrapper[4763]: I1006 15:51:15.295956 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hghkt"] Oct 06 15:51:15 crc kubenswrapper[4763]: I1006 15:51:15.412497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerStarted","Data":"3f14d9d3316424285d77a50c548f195e8e06f90742946f8b1626f2c668e2e4b4"} Oct 06 15:51:16 crc kubenswrapper[4763]: I1006 15:51:16.425032 4763 generic.go:334] "Generic (PLEG): container finished" podID="00a6426b-f415-4074-a90b-2f9b629737b5" containerID="16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7" exitCode=0 Oct 06 15:51:16 crc kubenswrapper[4763]: I1006 15:51:16.425092 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerDied","Data":"16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7"} Oct 06 15:51:16 crc kubenswrapper[4763]: I1006 15:51:16.431484 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:51:17 crc kubenswrapper[4763]: I1006 15:51:17.433203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerStarted","Data":"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6"} Oct 06 15:51:18 crc kubenswrapper[4763]: I1006 15:51:18.444404 4763 generic.go:334] "Generic (PLEG): container finished" podID="00a6426b-f415-4074-a90b-2f9b629737b5" containerID="90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6" exitCode=0 Oct 06 15:51:18 crc kubenswrapper[4763]: I1006 15:51:18.444455 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerDied","Data":"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6"} Oct 06 15:51:19 crc kubenswrapper[4763]: I1006 15:51:19.455542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerStarted","Data":"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124"} Oct 06 15:51:19 crc kubenswrapper[4763]: I1006 15:51:19.478296 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hghkt" podStartSLOduration=2.759631074 podStartE2EDuration="5.478273018s" podCreationTimestamp="2025-10-06 15:51:14 +0000 UTC" firstStartedPulling="2025-10-06 15:51:16.429058103 +0000 UTC m=+3473.584350665" lastFinishedPulling="2025-10-06 15:51:19.147700077 +0000 UTC m=+3476.302992609" observedRunningTime="2025-10-06 15:51:19.472709866 +0000 UTC m=+3476.628002398" watchObservedRunningTime="2025-10-06 
Oct 06 15:51:24 crc kubenswrapper[4763]: I1006 15:51:24.781463 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hghkt"
Oct 06 15:51:24 crc kubenswrapper[4763]: I1006 15:51:24.781803 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hghkt"
Oct 06 15:51:24 crc kubenswrapper[4763]: I1006 15:51:24.844691 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hghkt"
Oct 06 15:51:25 crc kubenswrapper[4763]: I1006 15:51:25.592838 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hghkt"
Oct 06 15:51:25 crc kubenswrapper[4763]: I1006 15:51:25.653727 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hghkt"]
Oct 06 15:51:27 crc kubenswrapper[4763]: I1006 15:51:27.528041 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hghkt" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="registry-server" containerID="cri-o://187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124" gracePeriod=2
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.432420 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hghkt"
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.444671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b96tq\" (UniqueName: \"kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq\") pod \"00a6426b-f415-4074-a90b-2f9b629737b5\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") "
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.444875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content\") pod \"00a6426b-f415-4074-a90b-2f9b629737b5\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") "
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.444908 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities\") pod \"00a6426b-f415-4074-a90b-2f9b629737b5\" (UID: \"00a6426b-f415-4074-a90b-2f9b629737b5\") "
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.445843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities" (OuterVolumeSpecName: "utilities") pod "00a6426b-f415-4074-a90b-2f9b629737b5" (UID: "00a6426b-f415-4074-a90b-2f9b629737b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.457192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq" (OuterVolumeSpecName: "kube-api-access-b96tq") pod "00a6426b-f415-4074-a90b-2f9b629737b5" (UID: "00a6426b-f415-4074-a90b-2f9b629737b5"). InnerVolumeSpecName "kube-api-access-b96tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.501287 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a6426b-f415-4074-a90b-2f9b629737b5" (UID: "00a6426b-f415-4074-a90b-2f9b629737b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.538465 4763 generic.go:334] "Generic (PLEG): container finished" podID="00a6426b-f415-4074-a90b-2f9b629737b5" containerID="187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124" exitCode=0 Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.538532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerDied","Data":"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124"} Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.538540 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hghkt" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.538556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghkt" event={"ID":"00a6426b-f415-4074-a90b-2f9b629737b5","Type":"ContainerDied","Data":"3f14d9d3316424285d77a50c548f195e8e06f90742946f8b1626f2c668e2e4b4"} Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.538573 4763 scope.go:117] "RemoveContainer" containerID="187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.546200 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.546229 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a6426b-f415-4074-a90b-2f9b629737b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.546242 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b96tq\" (UniqueName: \"kubernetes.io/projected/00a6426b-f415-4074-a90b-2f9b629737b5-kube-api-access-b96tq\") on node \"crc\" DevicePath \"\"" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.565863 4763 scope.go:117] "RemoveContainer" containerID="90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.592837 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hghkt"] Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.597211 4763 scope.go:117] "RemoveContainer" containerID="16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.602548 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hghkt"] Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.624543 4763 scope.go:117] "RemoveContainer" containerID="187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124" Oct 06 15:51:28 crc kubenswrapper[4763]: E1006 15:51:28.625046 4763 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124\": container with ID starting with 187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124 not found: ID does not exist" containerID="187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.625082 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124"} err="failed to get container status \"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124\": rpc error: code = NotFound desc = could not find container \"187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124\": container with ID starting with 187378fe84eac02415712ce853467d250fbca031283b421adb86748844794124 not found: ID does not exist" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.625108 4763 scope.go:117] "RemoveContainer" containerID="90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6" Oct 06 15:51:28 crc kubenswrapper[4763]: E1006 15:51:28.625595 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6\": container with ID starting with 90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6 not found: ID does not exist" containerID="90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.625644 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6"} err="failed to get container status \"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6\": rpc error: code = NotFound desc = could not find container \"90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6\": container with ID starting with 90afe0e171bd17b1e3495d7e8cd5fa56232744a9a828f0702efca736ce8eb4c6 not found: ID does not exist" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.625669 4763 scope.go:117] "RemoveContainer" containerID="16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7" Oct 06 15:51:28 crc kubenswrapper[4763]: E1006 15:51:28.626022 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7\": container with ID starting with 16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7 not found: ID does not exist" containerID="16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7" Oct 06 15:51:28 crc kubenswrapper[4763]: I1006 15:51:28.626054 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7"} err="failed to get container status \"16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7\": rpc error: code = NotFound desc = could not find container \"16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7\": container with ID starting with 16c6ae4bbfced77d631e11dcb81c95e8e82fe41d3417dbd616ae4628f2bbaba7 not found: ID does not exist" Oct 06 15:51:29 crc kubenswrapper[4763]: I1006 15:51:29.586439 4763 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" path="/var/lib/kubelet/pods/00a6426b-f415-4074-a90b-2f9b629737b5/volumes" Oct 06 15:53:03 crc kubenswrapper[4763]: I1006 15:53:03.877423 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:53:03 crc kubenswrapper[4763]: I1006 15:53:03.877945 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:53:33 crc kubenswrapper[4763]: I1006 15:53:33.876553 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:53:33 crc kubenswrapper[4763]: I1006 15:53:33.877557 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.756629 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:01 crc kubenswrapper[4763]: E1006 15:54:01.759117 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="extract-utilities" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.759340 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="extract-utilities" Oct 06 15:54:01 crc kubenswrapper[4763]: E1006 15:54:01.759438 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="registry-server" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.759515 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="registry-server" Oct 06 15:54:01 crc kubenswrapper[4763]: E1006 15:54:01.759604 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="extract-content" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.759698 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="extract-content" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.759941 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a6426b-f415-4074-a90b-2f9b629737b5" containerName="registry-server" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.763008 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.763738 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.954195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc4l5\" (UniqueName: \"kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.954560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:01 crc kubenswrapper[4763]: I1006 15:54:01.954639 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.056244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.056319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.056376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc4l5\" (UniqueName: \"kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.056808 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.056891 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.089661 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fc4l5\" (UniqueName: \"kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5\") pod \"community-operators-htl69\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.383000 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.801262 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:02 crc kubenswrapper[4763]: I1006 15:54:02.878874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerStarted","Data":"012b434cca11c2b9eebe134d04a2d797f0c2f4c86d4d04f68eccf3c30d72aa4b"} Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.877351 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.878189 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.878275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.879371 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.879472 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490" gracePeriod=600 Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.893822 4763 generic.go:334] "Generic (PLEG): container finished" podID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerID="2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992" exitCode=0 Oct 06 15:54:03 crc kubenswrapper[4763]: I1006 15:54:03.893899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerDied","Data":"2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992"} Oct 06 15:54:04 crc kubenswrapper[4763]: I1006 15:54:04.905649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" 
event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerStarted","Data":"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823"} Oct 06 15:54:04 crc kubenswrapper[4763]: I1006 15:54:04.908648 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490" exitCode=0 Oct 06 15:54:04 crc kubenswrapper[4763]: I1006 15:54:04.908691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490"} Oct 06 15:54:04 crc kubenswrapper[4763]: I1006 15:54:04.908718 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b"} Oct 06 15:54:04 crc kubenswrapper[4763]: I1006 15:54:04.908734 4763 scope.go:117] "RemoveContainer" containerID="1d42f2d0187cba219cd706a76f23c289433b948e49a53fadc98e86c13f4b94a2" Oct 06 15:54:05 crc kubenswrapper[4763]: I1006 15:54:05.922490 4763 generic.go:334] "Generic (PLEG): container finished" podID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerID="c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823" exitCode=0 Oct 06 15:54:05 crc kubenswrapper[4763]: I1006 15:54:05.922567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerDied","Data":"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823"} Oct 06 15:54:06 crc kubenswrapper[4763]: I1006 15:54:06.941626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerStarted","Data":"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe"} Oct 06 15:54:06 crc kubenswrapper[4763]: I1006 15:54:06.960095 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-htl69" podStartSLOduration=3.355889894 podStartE2EDuration="5.960073162s" podCreationTimestamp="2025-10-06 15:54:01 +0000 UTC" firstStartedPulling="2025-10-06 15:54:03.897300896 +0000 UTC m=+3641.052593408" lastFinishedPulling="2025-10-06 15:54:06.501484164 +0000 UTC m=+3643.656776676" observedRunningTime="2025-10-06 15:54:06.958237512 +0000 UTC m=+3644.113530024" watchObservedRunningTime="2025-10-06 15:54:06.960073162 +0000 UTC m=+3644.115365684" Oct 06 15:54:12 crc kubenswrapper[4763]: I1006 15:54:12.383790 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:12 crc kubenswrapper[4763]: I1006 15:54:12.384397 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:12 crc kubenswrapper[4763]: I1006 15:54:12.443410 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:13 crc kubenswrapper[4763]: I1006 15:54:13.065465 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:13 crc kubenswrapper[4763]: I1006 15:54:13.124056 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.016564 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-htl69" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="registry-server" containerID="cri-o://f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe" gracePeriod=2 Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.424147 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.566943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content\") pod \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.567095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc4l5\" (UniqueName: \"kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5\") pod \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.567123 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities\") pod \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\" (UID: \"42c15db3-ad99-4e40-8c49-30eb1bd11e6d\") " Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.568315 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities" (OuterVolumeSpecName: "utilities") pod "42c15db3-ad99-4e40-8c49-30eb1bd11e6d" (UID: "42c15db3-ad99-4e40-8c49-30eb1bd11e6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.579919 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5" (OuterVolumeSpecName: "kube-api-access-fc4l5") pod "42c15db3-ad99-4e40-8c49-30eb1bd11e6d" (UID: "42c15db3-ad99-4e40-8c49-30eb1bd11e6d"). InnerVolumeSpecName "kube-api-access-fc4l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.668905 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc4l5\" (UniqueName: \"kubernetes.io/projected/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-kube-api-access-fc4l5\") on node \"crc\" DevicePath \"\"" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.669189 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.676817 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42c15db3-ad99-4e40-8c49-30eb1bd11e6d" (UID: "42c15db3-ad99-4e40-8c49-30eb1bd11e6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:54:15 crc kubenswrapper[4763]: I1006 15:54:15.770967 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c15db3-ad99-4e40-8c49-30eb1bd11e6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.025796 4763 generic.go:334] "Generic (PLEG): container finished" podID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerID="f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe" exitCode=0 Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.025854 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htl69" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.025924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerDied","Data":"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe"} Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.027019 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htl69" event={"ID":"42c15db3-ad99-4e40-8c49-30eb1bd11e6d","Type":"ContainerDied","Data":"012b434cca11c2b9eebe134d04a2d797f0c2f4c86d4d04f68eccf3c30d72aa4b"} Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.027069 4763 scope.go:117] "RemoveContainer" containerID="f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.047926 4763 scope.go:117] "RemoveContainer" containerID="c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.068647 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.069039 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-htl69"] Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.088890 4763 scope.go:117] "RemoveContainer" containerID="2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.108653 4763 scope.go:117] "RemoveContainer" containerID="f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe" Oct 06 15:54:16 crc kubenswrapper[4763]: E1006 15:54:16.109221 4763 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe\": container with ID starting with f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe not found: ID does not exist" containerID="f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.109280 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe"} err="failed to get container status \"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe\": rpc error: code = NotFound desc = could not find container \"f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe\": container with ID starting with f84dbf081f8dd1d99857b0fea86545240ab5ec2720cacfc462847f9e939644fe not found: ID does not exist" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.109311 4763 scope.go:117] "RemoveContainer" containerID="c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823" Oct 06 15:54:16 crc kubenswrapper[4763]: E1006 15:54:16.109960 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823\": container with ID starting with c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823 not found: ID does not exist" containerID="c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.110003 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823"} err="failed to get container status \"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823\": rpc error: code = NotFound desc = could not find container \"c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823\": container with ID starting with c7d9d1dd80fa34049f1d4491424397af9c942b26fdf035598f5f46586b14e823 not found: ID does not exist" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.110032 4763 scope.go:117] "RemoveContainer" containerID="2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992" Oct 06 15:54:16 crc kubenswrapper[4763]: E1006 15:54:16.110346 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992\": container with ID starting with 2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992 not found: ID does not exist" containerID="2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992" Oct 06 15:54:16 crc kubenswrapper[4763]: I1006 15:54:16.110388 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992"} err="failed to get container status \"2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992\": rpc error: code = NotFound desc = could not find container \"2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992\": container with ID starting with 2ea331913e6443ca55fb7d410d78c133b33296bf37838bceed31fb645fe81992 not found: ID does not exist" Oct 06 15:54:17 crc kubenswrapper[4763]: I1006 15:54:17.591033 4763 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" path="/var/lib/kubelet/pods/42c15db3-ad99-4e40-8c49-30eb1bd11e6d/volumes" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.331961 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:27 crc kubenswrapper[4763]: E1006 15:55:27.333116 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="extract-content" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.333139 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="extract-content" Oct 06 15:55:27 crc kubenswrapper[4763]: E1006 15:55:27.333156 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="registry-server" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.333167 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="registry-server" Oct 06 15:55:27 crc kubenswrapper[4763]: E1006 15:55:27.333201 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="extract-utilities" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.333214 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="extract-utilities" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.333480 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c15db3-ad99-4e40-8c49-30eb1bd11e6d" containerName="registry-server" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.335252 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.349047 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.445420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.445779 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.446090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245mv\" (UniqueName: \"kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.547773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.548123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.548310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.548321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-245mv\" (UniqueName: \"kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.548674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.575839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-245mv\" (UniqueName: \"kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv\") pod \"redhat-operators-wwj77\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:27 crc kubenswrapper[4763]: I1006 15:55:27.672292 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:28 crc kubenswrapper[4763]: I1006 15:55:28.107155 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:28 crc kubenswrapper[4763]: I1006 15:55:28.645156 4763 generic.go:334] "Generic (PLEG): container finished" podID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerID="842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104" exitCode=0 Oct 06 15:55:28 crc kubenswrapper[4763]: I1006 15:55:28.645245 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerDied","Data":"842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104"} Oct 06 15:55:28 crc kubenswrapper[4763]: I1006 15:55:28.645478 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerStarted","Data":"2d54d43f2426e6624cdd759bca7c424a9dd80c782b522fdb02dadf7f981ba7f2"} Oct 06 15:55:30 crc kubenswrapper[4763]: I1006 15:55:30.667192 4763 generic.go:334] "Generic (PLEG): container finished" podID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerID="14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077" exitCode=0 Oct 06 15:55:30 crc kubenswrapper[4763]: I1006 15:55:30.667263 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerDied","Data":"14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077"} Oct 06 15:55:31 crc kubenswrapper[4763]: I1006 15:55:31.675982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerStarted","Data":"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf"} Oct 06 15:55:31 crc kubenswrapper[4763]: I1006 15:55:31.697156 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wwj77" podStartSLOduration=2.290128343 podStartE2EDuration="4.697139502s" podCreationTimestamp="2025-10-06 15:55:27 +0000 UTC" firstStartedPulling="2025-10-06 15:55:28.647100185 +0000 UTC m=+3725.802392697" lastFinishedPulling="2025-10-06 15:55:31.054111334 +0000 UTC m=+3728.209403856" observedRunningTime="2025-10-06 15:55:31.693947025 +0000 UTC m=+3728.849239537" watchObservedRunningTime="2025-10-06 15:55:31.697139502 +0000 UTC m=+3728.852432014" Oct 06 15:55:37 crc kubenswrapper[4763]: I1006 15:55:37.672674 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:37 crc kubenswrapper[4763]: I1006 15:55:37.673344 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:37 crc kubenswrapper[4763]: I1006 15:55:37.727480 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:37 crc kubenswrapper[4763]: I1006 15:55:37.805489 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:37 crc kubenswrapper[4763]: I1006 15:55:37.981736 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:39 crc kubenswrapper[4763]: I1006 15:55:39.772748 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wwj77" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="registry-server" containerID="cri-o://5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf" gracePeriod=2 Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.250397 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.345877 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content\") pod \"470fa7af-e49b-40e4-8b93-13ee5c1be591\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.345982 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities\") pod \"470fa7af-e49b-40e4-8b93-13ee5c1be591\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.346069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-245mv\" (UniqueName: \"kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv\") pod \"470fa7af-e49b-40e4-8b93-13ee5c1be591\" (UID: \"470fa7af-e49b-40e4-8b93-13ee5c1be591\") " Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.348503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities" (OuterVolumeSpecName: "utilities") pod "470fa7af-e49b-40e4-8b93-13ee5c1be591" (UID: "470fa7af-e49b-40e4-8b93-13ee5c1be591"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.352065 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv" (OuterVolumeSpecName: "kube-api-access-245mv") pod "470fa7af-e49b-40e4-8b93-13ee5c1be591" (UID: "470fa7af-e49b-40e4-8b93-13ee5c1be591"). InnerVolumeSpecName "kube-api-access-245mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.447548 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.447585 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-245mv\" (UniqueName: \"kubernetes.io/projected/470fa7af-e49b-40e4-8b93-13ee5c1be591-kube-api-access-245mv\") on node \"crc\" DevicePath \"\"" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.789971 4763 generic.go:334] "Generic (PLEG): container finished" podID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerID="5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf" exitCode=0 Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.790047 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwj77" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.790048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerDied","Data":"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf"} Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.790117 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwj77" event={"ID":"470fa7af-e49b-40e4-8b93-13ee5c1be591","Type":"ContainerDied","Data":"2d54d43f2426e6624cdd759bca7c424a9dd80c782b522fdb02dadf7f981ba7f2"} Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.790151 4763 scope.go:117] "RemoveContainer" containerID="5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.814737 4763 scope.go:117] "RemoveContainer" containerID="14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.832984 4763 scope.go:117] "RemoveContainer" containerID="842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.854706 4763 scope.go:117] "RemoveContainer" containerID="5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf" Oct 06 15:55:40 crc kubenswrapper[4763]: E1006 15:55:40.855294 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf\": container with ID starting with 5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf not found: ID does not exist" containerID="5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.855366 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf"} err="failed to get container status \"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf\": rpc error: code = NotFound desc = could not find container \"5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf\": container with ID starting with 5962bc2d98cc2f5be80ea41db00210f25be09d8d43f8ea0fa21eaeef88a22fbf not found: ID does not exist" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.855594 4763 scope.go:117] 
"RemoveContainer" containerID="14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077" Oct 06 15:55:40 crc kubenswrapper[4763]: E1006 15:55:40.856125 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077\": container with ID starting with 14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077 not found: ID does not exist" containerID="14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.856155 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077"} err="failed to get container status \"14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077\": rpc error: code = NotFound desc = could not find container \"14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077\": container with ID starting with 14da2c412e4075e55d9bffdfb81877cefc3894d8e061119ff916154b3924f077 not found: ID does not exist" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.856178 4763 scope.go:117] "RemoveContainer" containerID="842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104" Oct 06 15:55:40 crc kubenswrapper[4763]: E1006 15:55:40.856454 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104\": container with ID starting with 842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104 not found: ID does not exist" containerID="842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.856499 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104"} err="failed to get container status \"842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104\": rpc error: code = NotFound desc = could not find container \"842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104\": container with ID starting with 842f12b554790bdcd94dfa2899517b051b7f4cb6e8827d78c730705b118ae104 not found: ID does not exist" Oct 06 15:55:40 crc kubenswrapper[4763]: I1006 15:55:40.986210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "470fa7af-e49b-40e4-8b93-13ee5c1be591" (UID: "470fa7af-e49b-40e4-8b93-13ee5c1be591"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:55:41 crc kubenswrapper[4763]: I1006 15:55:41.056897 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470fa7af-e49b-40e4-8b93-13ee5c1be591-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:55:41 crc kubenswrapper[4763]: I1006 15:55:41.136278 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:41 crc kubenswrapper[4763]: I1006 15:55:41.143561 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wwj77"] Oct 06 15:55:41 crc kubenswrapper[4763]: I1006 15:55:41.592061 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" path="/var/lib/kubelet/pods/470fa7af-e49b-40e4-8b93-13ee5c1be591/volumes" Oct 06 15:56:33 crc kubenswrapper[4763]: I1006 15:56:33.877370 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:56:33 crc kubenswrapper[4763]: I1006 15:56:33.878021 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:57:03 crc kubenswrapper[4763]: I1006 15:57:03.877270 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:57:03 crc kubenswrapper[4763]: I1006 15:57:03.877946 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:57:33 crc kubenswrapper[4763]: I1006 15:57:33.876466 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 15:57:33 crc kubenswrapper[4763]: I1006 15:57:33.877268 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 15:57:33 crc kubenswrapper[4763]: I1006 15:57:33.877336 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 15:57:33 crc kubenswrapper[4763]: I1006 15:57:33.878298 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 15:57:33 crc kubenswrapper[4763]: I1006 15:57:33.878393 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" gracePeriod=600 Oct 06 15:57:34 crc kubenswrapper[4763]: E1006 15:57:34.008488 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:57:34 crc kubenswrapper[4763]: I1006 15:57:34.802251 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" exitCode=0 Oct 06 15:57:34 crc kubenswrapper[4763]: I1006 15:57:34.802325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b"} Oct 06 15:57:34 crc kubenswrapper[4763]: I1006 15:57:34.802398 4763 scope.go:117] "RemoveContainer" containerID="a532084cd16c21d25ee8c90d42e8fd90ed121de138e29cf41491726989b29490" Oct 06 15:57:34 crc kubenswrapper[4763]: I1006 15:57:34.803081 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:57:34 crc kubenswrapper[4763]: E1006 15:57:34.803829 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:57:49 crc kubenswrapper[4763]: I1006 15:57:49.575660 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:57:49 crc kubenswrapper[4763]: E1006 15:57:49.576710 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.751082 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:57:54 crc kubenswrapper[4763]: E1006 15:57:54.751678 4763 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="registry-server" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.751691 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="registry-server" Oct 06 15:57:54 crc kubenswrapper[4763]: E1006 15:57:54.751721 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="extract-utilities" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.751730 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="extract-utilities" Oct 06 15:57:54 crc kubenswrapper[4763]: E1006 15:57:54.751738 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="extract-content" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.751744 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="extract-content" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.751888 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="470fa7af-e49b-40e4-8b93-13ee5c1be591" containerName="registry-server" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.752991 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.771860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.909884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.909961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:54 crc kubenswrapper[4763]: I1006 15:57:54.910166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c92df\" (UniqueName: \"kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.011794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c92df\" (UniqueName: \"kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.011857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content\") 
pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.011890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.012357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.012411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.042829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c92df\" (UniqueName: \"kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df\") pod \"redhat-marketplace-4vgbc\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.071276 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.516697 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.963256 4763 generic.go:334] "Generic (PLEG): container finished" podID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerID="90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de" exitCode=0 Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.963316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerDied","Data":"90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de"} Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.963758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerStarted","Data":"5d40e69733889078577a8396cd1c735ea78de9c0ff645358b1568da2dacea9ca"} Oct 06 15:57:55 crc kubenswrapper[4763]: I1006 15:57:55.965799 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 15:57:57 crc kubenswrapper[4763]: I1006 15:57:57.986808 4763 generic.go:334] "Generic (PLEG): container finished" podID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerID="86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815" exitCode=0 Oct 06 15:57:57 crc kubenswrapper[4763]: I1006 15:57:57.986988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerDied","Data":"86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815"} Oct 06 15:57:58 crc kubenswrapper[4763]: I1006 15:57:58.996844 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerStarted","Data":"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f"} Oct 06 15:57:59 crc kubenswrapper[4763]: I1006 15:57:59.026211 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vgbc" podStartSLOduration=2.596156399 podStartE2EDuration="5.026187161s" podCreationTimestamp="2025-10-06 15:57:54 +0000 UTC" firstStartedPulling="2025-10-06 15:57:55.965212861 +0000 UTC m=+3873.120505413" lastFinishedPulling="2025-10-06 15:57:58.395243663 +0000 UTC m=+3875.550536175" observedRunningTime="2025-10-06 15:57:59.019988263 +0000 UTC m=+3876.175280775" watchObservedRunningTime="2025-10-06 15:57:59.026187161 +0000 UTC m=+3876.181479673" Oct 06 15:58:01 crc kubenswrapper[4763]: I1006 15:58:01.575869 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:58:01 crc kubenswrapper[4763]: E1006 15:58:01.576574 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:58:05 crc 
kubenswrapper[4763]: I1006 15:58:05.072218 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:05 crc kubenswrapper[4763]: I1006 15:58:05.072470 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:05 crc kubenswrapper[4763]: I1006 15:58:05.124463 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:06 crc kubenswrapper[4763]: I1006 15:58:06.098383 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:06 crc kubenswrapper[4763]: I1006 15:58:06.145226 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.067278 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vgbc" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="registry-server" containerID="cri-o://f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f" gracePeriod=2 Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.496974 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.620166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content\") pod \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.620253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities\") pod \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.620409 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c92df\" (UniqueName: \"kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df\") pod \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\" (UID: \"86d26fa8-478c-48d1-b1c7-edf4495a0f0c\") " Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.622146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities" (OuterVolumeSpecName: "utilities") pod "86d26fa8-478c-48d1-b1c7-edf4495a0f0c" (UID: "86d26fa8-478c-48d1-b1c7-edf4495a0f0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.633126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86d26fa8-478c-48d1-b1c7-edf4495a0f0c" (UID: "86d26fa8-478c-48d1-b1c7-edf4495a0f0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.634329 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df" (OuterVolumeSpecName: "kube-api-access-c92df") pod "86d26fa8-478c-48d1-b1c7-edf4495a0f0c" (UID: "86d26fa8-478c-48d1-b1c7-edf4495a0f0c"). InnerVolumeSpecName "kube-api-access-c92df". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.723006 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.723178 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 15:58:08 crc kubenswrapper[4763]: I1006 15:58:08.723194 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c92df\" (UniqueName: \"kubernetes.io/projected/86d26fa8-478c-48d1-b1c7-edf4495a0f0c-kube-api-access-c92df\") on node \"crc\" DevicePath \"\"" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.079692 4763 generic.go:334] "Generic (PLEG): container finished" podID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerID="f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f" exitCode=0 Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.079767 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerDied","Data":"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f"} Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.079819 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vgbc" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.079843 4763 scope.go:117] "RemoveContainer" containerID="f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.079827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vgbc" event={"ID":"86d26fa8-478c-48d1-b1c7-edf4495a0f0c","Type":"ContainerDied","Data":"5d40e69733889078577a8396cd1c735ea78de9c0ff645358b1568da2dacea9ca"} Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.103639 4763 scope.go:117] "RemoveContainer" containerID="86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.119647 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.127799 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vgbc"] Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.149580 4763 scope.go:117] "RemoveContainer" containerID="90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.179258 4763 scope.go:117] "RemoveContainer" containerID="f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f" Oct 06 15:58:09 crc kubenswrapper[4763]: E1006 15:58:09.179961 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f\": container with ID starting with f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f not found: ID does not exist" containerID="f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.180003 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f"} err="failed to get container status \"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f\": rpc error: code = NotFound desc = could not find container \"f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f\": container with ID starting with f7b8e06dd4d817883be8d39e9d3da768a610abe26d952ef9aad5cf7fb253d66f not found: ID does not exist" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.180028 4763 scope.go:117] "RemoveContainer" containerID="86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815" Oct 06 15:58:09 crc kubenswrapper[4763]: E1006 15:58:09.180344 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815\": container with ID starting with 86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815 not found: ID does not exist" containerID="86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.180371 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815"} err="failed to get container status \"86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815\": rpc error: code = NotFound desc = could not find 
container \"86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815\": container with ID starting with 86c3092d6a308eabb82a172c2620271a852fb2a050be86a36bfcae7ff17d9815 not found: ID does not exist" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.180383 4763 scope.go:117] "RemoveContainer" containerID="90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de" Oct 06 15:58:09 crc kubenswrapper[4763]: E1006 15:58:09.180698 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de\": container with ID starting with 90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de not found: ID does not exist" containerID="90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.180751 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de"} err="failed to get container status \"90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de\": rpc error: code = NotFound desc = could not find container \"90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de\": container with ID starting with 90cd3dc1cf5a85de8f82adfc257123545dcd1251becf47d8531b0850ea3653de not found: ID does not exist" Oct 06 15:58:09 crc kubenswrapper[4763]: I1006 15:58:09.589101 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" path="/var/lib/kubelet/pods/86d26fa8-478c-48d1-b1c7-edf4495a0f0c/volumes" Oct 06 15:58:12 crc kubenswrapper[4763]: I1006 15:58:12.575199 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:58:12 crc kubenswrapper[4763]: E1006 15:58:12.575928 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:58:26 crc kubenswrapper[4763]: I1006 15:58:26.575658 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:58:26 crc kubenswrapper[4763]: E1006 15:58:26.576861 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:58:37 crc kubenswrapper[4763]: I1006 15:58:37.574854 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:58:37 crc kubenswrapper[4763]: E1006 15:58:37.575559 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:58:51 crc kubenswrapper[4763]: I1006 15:58:51.575298 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:58:51 crc kubenswrapper[4763]: E1006 15:58:51.576111 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:59:02 crc kubenswrapper[4763]: I1006 15:59:02.575788 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:59:02 crc kubenswrapper[4763]: E1006 15:59:02.578079 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:59:16 crc kubenswrapper[4763]: I1006 15:59:16.575488 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:59:16 crc kubenswrapper[4763]: E1006 15:59:16.576729 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:59:31 crc kubenswrapper[4763]: I1006 15:59:31.575974 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:59:31 crc kubenswrapper[4763]: E1006 15:59:31.577169 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:59:46 crc kubenswrapper[4763]: I1006 15:59:46.575199 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:59:46 crc kubenswrapper[4763]: E1006 15:59:46.576200 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 15:59:59 crc kubenswrapper[4763]: I1006 15:59:59.574958 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 15:59:59 crc kubenswrapper[4763]: E1006 15:59:59.575943 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.159132 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck"] Oct 06 16:00:00 crc kubenswrapper[4763]: E1006 16:00:00.159565 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="extract-content" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.159593 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="extract-content" Oct 06 16:00:00 crc kubenswrapper[4763]: E1006 16:00:00.159611 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="registry-server" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.159645 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="registry-server" Oct 06 16:00:00 crc kubenswrapper[4763]: E1006 16:00:00.159663 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="extract-utilities" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.159673 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="extract-utilities" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.159900 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d26fa8-478c-48d1-b1c7-edf4495a0f0c" containerName="registry-server" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.160635 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.162763 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.163134 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.179670 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck"] Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.265173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.265232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrbp\" (UniqueName: \"kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.265291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.366085 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.366146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrbp\" (UniqueName: \"kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.366204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.368075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume\") pod 
\"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.384863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrbp\" (UniqueName: \"kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.386241 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume\") pod \"collect-profiles-29329440-dxjck\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.486762 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:00 crc kubenswrapper[4763]: I1006 16:00:00.945357 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck"] Oct 06 16:00:00 crc kubenswrapper[4763]: W1006 16:00:00.950724 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf38ef16_1123_4994_aea6_fcf552779957.slice/crio-2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de WatchSource:0}: Error finding container 2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de: Status 404 returned error can't find the container with id 2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de Oct 06 16:00:01 crc kubenswrapper[4763]: I1006 16:00:01.071534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" event={"ID":"cf38ef16-1123-4994-aea6-fcf552779957","Type":"ContainerStarted","Data":"2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de"} Oct 06 16:00:02 crc kubenswrapper[4763]: I1006 16:00:02.082033 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf38ef16-1123-4994-aea6-fcf552779957" containerID="40a757256c9198cd9bec8d3945ba649f98e41777aa43b0f6acbcac3a8e9455b8" exitCode=0 Oct 06 16:00:02 crc kubenswrapper[4763]: I1006 16:00:02.082085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" event={"ID":"cf38ef16-1123-4994-aea6-fcf552779957","Type":"ContainerDied","Data":"40a757256c9198cd9bec8d3945ba649f98e41777aa43b0f6acbcac3a8e9455b8"} Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.390497 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.409418 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume\") pod \"cf38ef16-1123-4994-aea6-fcf552779957\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.409533 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume\") pod \"cf38ef16-1123-4994-aea6-fcf552779957\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.409660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmrbp\" (UniqueName: \"kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp\") pod \"cf38ef16-1123-4994-aea6-fcf552779957\" (UID: \"cf38ef16-1123-4994-aea6-fcf552779957\") " Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.412839 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf38ef16-1123-4994-aea6-fcf552779957" (UID: "cf38ef16-1123-4994-aea6-fcf552779957"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.435345 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf38ef16-1123-4994-aea6-fcf552779957" (UID: "cf38ef16-1123-4994-aea6-fcf552779957"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.436821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp" (OuterVolumeSpecName: "kube-api-access-wmrbp") pod "cf38ef16-1123-4994-aea6-fcf552779957" (UID: "cf38ef16-1123-4994-aea6-fcf552779957"). InnerVolumeSpecName "kube-api-access-wmrbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.511770 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmrbp\" (UniqueName: \"kubernetes.io/projected/cf38ef16-1123-4994-aea6-fcf552779957-kube-api-access-wmrbp\") on node \"crc\" DevicePath \"\"" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.511826 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf38ef16-1123-4994-aea6-fcf552779957-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:00:03 crc kubenswrapper[4763]: I1006 16:00:03.511843 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf38ef16-1123-4994-aea6-fcf552779957-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:00:04 crc kubenswrapper[4763]: I1006 16:00:04.100151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" event={"ID":"cf38ef16-1123-4994-aea6-fcf552779957","Type":"ContainerDied","Data":"2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de"} Oct 06 16:00:04 crc kubenswrapper[4763]: I1006 16:00:04.100446 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac2264f6c65f2f11cf79af0e2798e1b6ce5392ba96c1686b9005c4e9ba6c7de" Oct 06 16:00:04 crc kubenswrapper[4763]: I1006 16:00:04.100220 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck" Oct 06 16:00:04 crc kubenswrapper[4763]: I1006 16:00:04.480658 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr"] Oct 06 16:00:04 crc kubenswrapper[4763]: I1006 16:00:04.485360 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329395-lgkmr"] Oct 06 16:00:05 crc kubenswrapper[4763]: I1006 16:00:05.589301 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57c7e43-942b-418a-978c-87a2e535d430" path="/var/lib/kubelet/pods/e57c7e43-942b-418a-978c-87a2e535d430/volumes" Oct 06 16:00:14 crc kubenswrapper[4763]: I1006 16:00:14.575811 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:00:14 crc kubenswrapper[4763]: E1006 16:00:14.576768 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:00:26 crc kubenswrapper[4763]: I1006 16:00:26.576041 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:00:26 crc kubenswrapper[4763]: E1006 16:00:26.577347 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:00:28 crc kubenswrapper[4763]: I1006 16:00:28.494055 4763 scope.go:117] "RemoveContainer" containerID="1064598d9fd0ba5a4231a251144278663c79cbd4c6f961d209063c216fe15c4a" Oct 06 16:00:39 crc kubenswrapper[4763]: I1006 16:00:39.575593 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:00:39 crc kubenswrapper[4763]: E1006 16:00:39.576963 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:00:51 crc kubenswrapper[4763]: I1006 16:00:51.575280 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:00:51 crc kubenswrapper[4763]: E1006 16:00:51.576332 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:01:05 crc kubenswrapper[4763]: I1006 16:01:05.575488 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:01:05 crc kubenswrapper[4763]: E1006 16:01:05.576424 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:01:19 crc kubenswrapper[4763]: I1006 16:01:19.577611 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:01:19 crc kubenswrapper[4763]: E1006 16:01:19.578686 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:01:34 crc kubenswrapper[4763]: I1006 16:01:34.575077 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:01:34 crc kubenswrapper[4763]: E1006 16:01:34.576199 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:01:46 crc kubenswrapper[4763]: I1006 16:01:46.575509 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:01:46 crc kubenswrapper[4763]: E1006 16:01:46.576528 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:02:00 crc kubenswrapper[4763]: I1006 16:02:00.574906 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:02:00 crc kubenswrapper[4763]: E1006 16:02:00.575931 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:02:11 crc kubenswrapper[4763]: I1006 16:02:11.574764 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:02:11 crc kubenswrapper[4763]: E1006 16:02:11.575556 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:02:23 crc kubenswrapper[4763]: I1006 16:02:23.584376 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:02:23 crc kubenswrapper[4763]: E1006 16:02:23.585792 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:02:35 crc kubenswrapper[4763]: I1006 16:02:35.575013 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:02:36 crc kubenswrapper[4763]: I1006 16:02:36.402415 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a"} Oct 06 16:05:03 crc kubenswrapper[4763]: I1006 16:05:03.877437 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
Oct 06 16:05:03 crc kubenswrapper[4763]: I1006 16:05:03.877437 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 16:05:03 crc kubenswrapper[4763]: I1006 16:05:03.878148 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 16:05:26 crc kubenswrapper[4763]: I1006 16:05:26.988590 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b7g48"]
Oct 06 16:05:26 crc kubenswrapper[4763]: E1006 16:05:26.990270 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf38ef16-1123-4994-aea6-fcf552779957" containerName="collect-profiles"
Oct 06 16:05:26 crc kubenswrapper[4763]: I1006 16:05:26.990290 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf38ef16-1123-4994-aea6-fcf552779957" containerName="collect-profiles"
Oct 06 16:05:26 crc kubenswrapper[4763]: I1006 16:05:26.990515 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf38ef16-1123-4994-aea6-fcf552779957" containerName="collect-profiles"
Oct 06 16:05:26 crc kubenswrapper[4763]: I1006 16:05:26.993346 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b7g48"
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.034236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b7g48"]
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.159633 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48"
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.159721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48"
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.160234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27l6p\" (UniqueName: \"kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48"
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.261282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48"
Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.261397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities\")
pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.261506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27l6p\" (UniqueName: \"kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.262702 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.263268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.284649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27l6p\" (UniqueName: \"kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p\") pod \"certified-operators-b7g48\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.327203 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:27 crc kubenswrapper[4763]: I1006 16:05:27.816591 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b7g48"] Oct 06 16:05:27 crc kubenswrapper[4763]: W1006 16:05:27.822910 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76e49de_d2fd_4593_a18c_bfe9b063f9cf.slice/crio-2408feab39345aeaa1583f81a38233ebf6654e730345077f6139e51b0e912377 WatchSource:0}: Error finding container 2408feab39345aeaa1583f81a38233ebf6654e730345077f6139e51b0e912377: Status 404 returned error can't find the container with id 2408feab39345aeaa1583f81a38233ebf6654e730345077f6139e51b0e912377 Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.783097 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.787422 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.801033 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.850272 4763 generic.go:334] "Generic (PLEG): container finished" podID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerID="03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06" exitCode=0 Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.850335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerDied","Data":"03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06"} Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.850397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerStarted","Data":"2408feab39345aeaa1583f81a38233ebf6654e730345077f6139e51b0e912377"} Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.852681 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.888278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.889056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4wv\" (UniqueName: \"kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.889199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.991162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.991501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4wv\" (UniqueName: \"kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.991678 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.991958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:28 crc kubenswrapper[4763]: I1006 16:05:28.992371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.022880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4wv\" (UniqueName: \"kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv\") pod \"community-operators-pljvj\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.136006 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.373470 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.374899 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.389886 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.498117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxz8\" (UniqueName: \"kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.498196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.498256 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.599471 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxz8\" (UniqueName: \"kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.599522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.599551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.600094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.600373 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.623302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qnxz8\" (UniqueName: \"kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8\") pod \"redhat-operators-g9t2j\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.637714 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.695826 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:29 crc kubenswrapper[4763]: I1006 16:05:29.857812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerStarted","Data":"b3e661fdbb51741994318664e66be65ff81af988a3ad7d00e418e06531b296f6"} Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.184726 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:30 crc kubenswrapper[4763]: W1006 16:05:30.184762 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cccd13e_7c0f_4f6c_90ea_b7fe37fe23f4.slice/crio-0199c0bc29258b44352a1bad9966e31c97bdf9f1d852e56150290f318eb5d894 WatchSource:0}: Error finding container 0199c0bc29258b44352a1bad9966e31c97bdf9f1d852e56150290f318eb5d894: Status 404 returned error can't find the container with id 0199c0bc29258b44352a1bad9966e31c97bdf9f1d852e56150290f318eb5d894 Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.883184 4763 generic.go:334] "Generic (PLEG): container finished" podID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerID="c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4" exitCode=0 Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.883582 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerDied","Data":"c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4"} Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.886458 4763 generic.go:334] "Generic (PLEG): container finished" podID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerID="b2313b0e43b5e80eb463a7e05581a9aea96e8a49221d9403b338797b3280b898" exitCode=0 Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.886537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerDied","Data":"b2313b0e43b5e80eb463a7e05581a9aea96e8a49221d9403b338797b3280b898"} Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.886565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerStarted","Data":"0199c0bc29258b44352a1bad9966e31c97bdf9f1d852e56150290f318eb5d894"} Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.890915 4763 generic.go:334] "Generic (PLEG): container finished" podID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerID="c1d784f2b5213cb7a84d0d48b4ec22220ebd6b0c33571d1cca9c70f32cce4ccf" exitCode=0 Oct 06 16:05:30 crc kubenswrapper[4763]: I1006 16:05:30.891038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" 
event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerDied","Data":"c1d784f2b5213cb7a84d0d48b4ec22220ebd6b0c33571d1cca9c70f32cce4ccf"} Oct 06 16:05:31 crc kubenswrapper[4763]: I1006 16:05:31.901687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerStarted","Data":"c43e19bd9312cd63c757fe4519886996b6792b7c4374748991ed45a3493acf42"} Oct 06 16:05:31 crc kubenswrapper[4763]: I1006 16:05:31.906055 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerStarted","Data":"a8d30f77a30102db13e1ca9c82669c2a9df11d962d2673b15e7996f3e1cb3cd6"} Oct 06 16:05:32 crc kubenswrapper[4763]: I1006 16:05:32.916645 4763 generic.go:334] "Generic (PLEG): container finished" podID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerID="c43e19bd9312cd63c757fe4519886996b6792b7c4374748991ed45a3493acf42" exitCode=0 Oct 06 16:05:32 crc kubenswrapper[4763]: I1006 16:05:32.916692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerDied","Data":"c43e19bd9312cd63c757fe4519886996b6792b7c4374748991ed45a3493acf42"} Oct 06 16:05:32 crc kubenswrapper[4763]: I1006 16:05:32.920108 4763 generic.go:334] "Generic (PLEG): container finished" podID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerID="a8d30f77a30102db13e1ca9c82669c2a9df11d962d2673b15e7996f3e1cb3cd6" exitCode=0 Oct 06 16:05:32 crc kubenswrapper[4763]: I1006 16:05:32.920309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerDied","Data":"a8d30f77a30102db13e1ca9c82669c2a9df11d962d2673b15e7996f3e1cb3cd6"} Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.877553 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.877679 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.931729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerStarted","Data":"30a92b367824960e5820b4d6c1a25c5c5e2844099cca0fbeb5e021ebcab011e2"} Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.934190 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerStarted","Data":"7d9b1e6acd66528f428cc3cb366f225513765fc5f8d3f2ce93c11d07eae688ec"} Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.955330 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9t2j" podStartSLOduration=2.46350559 
podStartE2EDuration="4.955305992s" podCreationTimestamp="2025-10-06 16:05:29 +0000 UTC" firstStartedPulling="2025-10-06 16:05:30.888900804 +0000 UTC m=+4328.044193356" lastFinishedPulling="2025-10-06 16:05:33.380701246 +0000 UTC m=+4330.535993758" observedRunningTime="2025-10-06 16:05:33.947769097 +0000 UTC m=+4331.103061629" watchObservedRunningTime="2025-10-06 16:05:33.955305992 +0000 UTC m=+4331.110598504" Oct 06 16:05:33 crc kubenswrapper[4763]: I1006 16:05:33.966063 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pljvj" podStartSLOduration=3.521413085 podStartE2EDuration="5.966047414s" podCreationTimestamp="2025-10-06 16:05:28 +0000 UTC" firstStartedPulling="2025-10-06 16:05:30.897179749 +0000 UTC m=+4328.052472261" lastFinishedPulling="2025-10-06 16:05:33.341814078 +0000 UTC m=+4330.497106590" observedRunningTime="2025-10-06 16:05:33.964651146 +0000 UTC m=+4331.119943728" watchObservedRunningTime="2025-10-06 16:05:33.966047414 +0000 UTC m=+4331.121339926" Oct 06 16:05:35 crc kubenswrapper[4763]: I1006 16:05:35.956024 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerStarted","Data":"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e"} Oct 06 16:05:37 crc kubenswrapper[4763]: I1006 16:05:37.328191 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:37 crc kubenswrapper[4763]: I1006 16:05:37.328578 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:37 crc kubenswrapper[4763]: I1006 16:05:37.394421 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:37 crc kubenswrapper[4763]: I1006 16:05:37.419136 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b7g48" podStartSLOduration=4.7520384799999995 podStartE2EDuration="11.419117193s" podCreationTimestamp="2025-10-06 16:05:26 +0000 UTC" firstStartedPulling="2025-10-06 16:05:28.85236799 +0000 UTC m=+4326.007660512" lastFinishedPulling="2025-10-06 16:05:35.519446713 +0000 UTC m=+4332.674739225" observedRunningTime="2025-10-06 16:05:35.975346518 +0000 UTC m=+4333.130639030" watchObservedRunningTime="2025-10-06 16:05:37.419117193 +0000 UTC m=+4334.574409705" Oct 06 16:05:39 crc kubenswrapper[4763]: I1006 16:05:39.136565 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:39 crc kubenswrapper[4763]: I1006 16:05:39.137181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:39 crc kubenswrapper[4763]: I1006 16:05:39.217157 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:39 crc kubenswrapper[4763]: I1006 16:05:39.696159 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:39 crc kubenswrapper[4763]: I1006 16:05:39.696231 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:39 crc 
kubenswrapper[4763]: I1006 16:05:39.751008 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:40 crc kubenswrapper[4763]: I1006 16:05:40.062006 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:40 crc kubenswrapper[4763]: I1006 16:05:40.078500 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:41 crc kubenswrapper[4763]: I1006 16:05:41.561646 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:42 crc kubenswrapper[4763]: I1006 16:05:42.018982 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pljvj" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="registry-server" containerID="cri-o://7d9b1e6acd66528f428cc3cb366f225513765fc5f8d3f2ce93c11d07eae688ec" gracePeriod=2 Oct 06 16:05:42 crc kubenswrapper[4763]: I1006 16:05:42.170174 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:42 crc kubenswrapper[4763]: I1006 16:05:42.170505 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9t2j" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="registry-server" containerID="cri-o://30a92b367824960e5820b4d6c1a25c5c5e2844099cca0fbeb5e021ebcab011e2" gracePeriod=2 Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.044522 4763 generic.go:334] "Generic (PLEG): container finished" podID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerID="30a92b367824960e5820b4d6c1a25c5c5e2844099cca0fbeb5e021ebcab011e2" exitCode=0 Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.044580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerDied","Data":"30a92b367824960e5820b4d6c1a25c5c5e2844099cca0fbeb5e021ebcab011e2"} Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.051973 4763 generic.go:334] "Generic (PLEG): container finished" podID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerID="7d9b1e6acd66528f428cc3cb366f225513765fc5f8d3f2ce93c11d07eae688ec" exitCode=0 Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.052031 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerDied","Data":"7d9b1e6acd66528f428cc3cb366f225513765fc5f8d3f2ce93c11d07eae688ec"} Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.220702 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.323161 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wv\" (UniqueName: \"kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv\") pod \"0e6e753b-eddd-4840-9c13-cebc14f61092\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.323269 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities\") pod \"0e6e753b-eddd-4840-9c13-cebc14f61092\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.323374 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content\") pod \"0e6e753b-eddd-4840-9c13-cebc14f61092\" (UID: \"0e6e753b-eddd-4840-9c13-cebc14f61092\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.324160 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities" (OuterVolumeSpecName: "utilities") pod "0e6e753b-eddd-4840-9c13-cebc14f61092" (UID: "0e6e753b-eddd-4840-9c13-cebc14f61092"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.328719 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv" (OuterVolumeSpecName: "kube-api-access-2d4wv") pod "0e6e753b-eddd-4840-9c13-cebc14f61092" (UID: "0e6e753b-eddd-4840-9c13-cebc14f61092"). InnerVolumeSpecName "kube-api-access-2d4wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.370888 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e6e753b-eddd-4840-9c13-cebc14f61092" (UID: "0e6e753b-eddd-4840-9c13-cebc14f61092"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.374126 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.424682 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wv\" (UniqueName: \"kubernetes.io/projected/0e6e753b-eddd-4840-9c13-cebc14f61092-kube-api-access-2d4wv\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.424727 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.424739 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6e753b-eddd-4840-9c13-cebc14f61092-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.526195 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnxz8\" (UniqueName: \"kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8\") pod \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.526292 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities\") pod \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.526388 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content\") pod \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\" (UID: \"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4\") " Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.528939 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities" (OuterVolumeSpecName: "utilities") pod "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" (UID: "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.529327 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8" (OuterVolumeSpecName: "kube-api-access-qnxz8") pod "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" (UID: "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4"). InnerVolumeSpecName "kube-api-access-qnxz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.620275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" (UID: "3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.628226 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.628380 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:44 crc kubenswrapper[4763]: I1006 16:05:44.628495 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnxz8\" (UniqueName: \"kubernetes.io/projected/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4-kube-api-access-qnxz8\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.067605 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9t2j" event={"ID":"3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4","Type":"ContainerDied","Data":"0199c0bc29258b44352a1bad9966e31c97bdf9f1d852e56150290f318eb5d894"} Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.068709 4763 scope.go:117] "RemoveContainer" containerID="30a92b367824960e5820b4d6c1a25c5c5e2844099cca0fbeb5e021ebcab011e2" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.067653 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9t2j" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.069791 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pljvj" event={"ID":"0e6e753b-eddd-4840-9c13-cebc14f61092","Type":"ContainerDied","Data":"b3e661fdbb51741994318664e66be65ff81af988a3ad7d00e418e06531b296f6"} Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.069869 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pljvj" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.129472 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.129862 4763 scope.go:117] "RemoveContainer" containerID="a8d30f77a30102db13e1ca9c82669c2a9df11d962d2673b15e7996f3e1cb3cd6" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.138960 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pljvj"] Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.144882 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.150688 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9t2j"] Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.154897 4763 scope.go:117] "RemoveContainer" containerID="b2313b0e43b5e80eb463a7e05581a9aea96e8a49221d9403b338797b3280b898" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.196067 4763 scope.go:117] "RemoveContainer" containerID="7d9b1e6acd66528f428cc3cb366f225513765fc5f8d3f2ce93c11d07eae688ec" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.220698 4763 scope.go:117] "RemoveContainer" containerID="c43e19bd9312cd63c757fe4519886996b6792b7c4374748991ed45a3493acf42" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.244076 4763 scope.go:117] "RemoveContainer" containerID="c1d784f2b5213cb7a84d0d48b4ec22220ebd6b0c33571d1cca9c70f32cce4ccf" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.587750 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" path="/var/lib/kubelet/pods/0e6e753b-eddd-4840-9c13-cebc14f61092/volumes" Oct 06 16:05:45 crc kubenswrapper[4763]: I1006 16:05:45.589051 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" path="/var/lib/kubelet/pods/3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4/volumes" Oct 06 16:05:47 crc kubenswrapper[4763]: I1006 16:05:47.370242 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:49 crc kubenswrapper[4763]: I1006 16:05:49.564428 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b7g48"] Oct 06 16:05:49 crc kubenswrapper[4763]: I1006 16:05:49.564874 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b7g48" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="registry-server" containerID="cri-o://cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e" gracePeriod=2 Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.023373 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.102737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content\") pod \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.102800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27l6p\" (UniqueName: \"kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p\") pod \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.102825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities\") pod \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\" (UID: \"f76e49de-d2fd-4593-a18c-bfe9b063f9cf\") " Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.103928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities" (OuterVolumeSpecName: "utilities") pod "f76e49de-d2fd-4593-a18c-bfe9b063f9cf" (UID: "f76e49de-d2fd-4593-a18c-bfe9b063f9cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.108701 4763 generic.go:334] "Generic (PLEG): container finished" podID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerID="cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e" exitCode=0 Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.108753 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b7g48" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.109018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p" (OuterVolumeSpecName: "kube-api-access-27l6p") pod "f76e49de-d2fd-4593-a18c-bfe9b063f9cf" (UID: "f76e49de-d2fd-4593-a18c-bfe9b063f9cf"). InnerVolumeSpecName "kube-api-access-27l6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.108769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerDied","Data":"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e"} Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.109131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7g48" event={"ID":"f76e49de-d2fd-4593-a18c-bfe9b063f9cf","Type":"ContainerDied","Data":"2408feab39345aeaa1583f81a38233ebf6654e730345077f6139e51b0e912377"} Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.109182 4763 scope.go:117] "RemoveContainer" containerID="cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.140469 4763 scope.go:117] "RemoveContainer" containerID="c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.164961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f76e49de-d2fd-4593-a18c-bfe9b063f9cf" (UID: "f76e49de-d2fd-4593-a18c-bfe9b063f9cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.165900 4763 scope.go:117] "RemoveContainer" containerID="03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.184420 4763 scope.go:117] "RemoveContainer" containerID="cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e" Oct 06 16:05:50 crc kubenswrapper[4763]: E1006 16:05:50.184950 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e\": container with ID starting with cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e not found: ID does not exist" containerID="cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.184983 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e"} err="failed to get container status \"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e\": rpc error: code = NotFound desc = could not find container \"cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e\": container with ID starting with cd5b517f7c43a9fdcd16f897960014cd88a5bb604db5323e3ba09b11458e131e not found: ID does not exist" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.185004 4763 scope.go:117] "RemoveContainer" containerID="c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4" Oct 06 16:05:50 crc kubenswrapper[4763]: E1006 16:05:50.185351 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4\": container with ID starting with c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4 not found: ID does not exist" 
containerID="c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.185388 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4"} err="failed to get container status \"c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4\": rpc error: code = NotFound desc = could not find container \"c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4\": container with ID starting with c9e39d17812926049d10e6101ab2d2bfea69837de2b46a29f983788865edfda4 not found: ID does not exist" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.185409 4763 scope.go:117] "RemoveContainer" containerID="03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06" Oct 06 16:05:50 crc kubenswrapper[4763]: E1006 16:05:50.185727 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06\": container with ID starting with 03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06 not found: ID does not exist" containerID="03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.185755 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06"} err="failed to get container status \"03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06\": rpc error: code = NotFound desc = could not find container \"03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06\": container with ID starting with 03f4f78fa428691f89914462cdb7c2f507d4682bf4eb083cf2da95fd2476dd06 not found: ID does not exist" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.204769 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.204809 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27l6p\" (UniqueName: \"kubernetes.io/projected/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-kube-api-access-27l6p\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.204818 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76e49de-d2fd-4593-a18c-bfe9b063f9cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.443222 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b7g48"] Oct 06 16:05:50 crc kubenswrapper[4763]: I1006 16:05:50.447240 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b7g48"] Oct 06 16:05:51 crc kubenswrapper[4763]: I1006 16:05:51.586664 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" path="/var/lib/kubelet/pods/f76e49de-d2fd-4593-a18c-bfe9b063f9cf/volumes" Oct 06 16:06:03 crc kubenswrapper[4763]: I1006 16:06:03.876911 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:06:03 crc kubenswrapper[4763]: I1006 16:06:03.877493 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:06:03 crc kubenswrapper[4763]: I1006 16:06:03.877537 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:06:03 crc kubenswrapper[4763]: I1006 16:06:03.878143 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:06:03 crc kubenswrapper[4763]: I1006 16:06:03.878264 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a" gracePeriod=600 Oct 06 16:06:04 crc kubenswrapper[4763]: I1006 16:06:04.229509 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a" exitCode=0 Oct 06 16:06:04 crc kubenswrapper[4763]: I1006 16:06:04.229596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a"} Oct 06 16:06:04 crc kubenswrapper[4763]: I1006 16:06:04.229705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a"} Oct 06 16:06:04 crc kubenswrapper[4763]: I1006 16:06:04.229748 4763 scope.go:117] "RemoveContainer" containerID="9614ecb88afeb5171cf84affef862f127b03144f468b58285b9af913b32e704b" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.062611 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"] Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.071282 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"] Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153119 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qv4td"] Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153438 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="registry-server" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153456 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="registry-server" Oct 06 16:06:48 crc 
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.062611 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"]
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.071282 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-b7qld"]
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153119 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qv4td"]
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153438 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153456 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153471 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153479 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153493 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153501 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153521 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153527 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="extract-utilities"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153537 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153543 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153554 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153561 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153569 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153574 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153582 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153587 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="extract-content"
Oct 06 16:06:48 crc kubenswrapper[4763]: E1006 16:06:48.153597 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153603 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153834 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e49de-d2fd-4593-a18c-bfe9b063f9cf" containerName="registry-server"
Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153852 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cccd13e-7c0f-4f6c-90ea-b7fe37fe23f4"
containerName="registry-server" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.153874 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6e753b-eddd-4840-9c13-cebc14f61092" containerName="registry-server" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.154365 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.157071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6jdbs" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.157278 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.157316 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.158544 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.164378 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qv4td"] Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.187001 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.187046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.187266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tl5\" (UniqueName: \"kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.288264 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.288330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.288372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tl5\" (UniqueName: \"kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " 
pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.289230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.289226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.313689 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tl5\" (UniqueName: \"kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5\") pod \"crc-storage-crc-qv4td\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.474989 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:48 crc kubenswrapper[4763]: I1006 16:06:48.909090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qv4td"] Oct 06 16:06:49 crc kubenswrapper[4763]: I1006 16:06:49.585277 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccb8f11-9845-413f-b1a5-3c3b73e8fc33" path="/var/lib/kubelet/pods/8ccb8f11-9845-413f-b1a5-3c3b73e8fc33/volumes" Oct 06 16:06:49 crc kubenswrapper[4763]: I1006 16:06:49.655443 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qv4td" event={"ID":"8db7e74e-229e-41da-9c7c-4b0cac6f5642","Type":"ContainerStarted","Data":"c7454c4ae819ae52781e4ff1843b92ae78d473eac186896bfff160dbbdd27c4b"} Oct 06 16:06:50 crc kubenswrapper[4763]: I1006 16:06:50.664687 4763 generic.go:334] "Generic (PLEG): container finished" podID="8db7e74e-229e-41da-9c7c-4b0cac6f5642" containerID="866c5b5e21dcca95359e07dfea280f160fe712da6cecf10826d19fd6738a6a45" exitCode=0 Oct 06 16:06:50 crc kubenswrapper[4763]: I1006 16:06:50.664755 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qv4td" event={"ID":"8db7e74e-229e-41da-9c7c-4b0cac6f5642","Type":"ContainerDied","Data":"866c5b5e21dcca95359e07dfea280f160fe712da6cecf10826d19fd6738a6a45"} Oct 06 16:06:51 crc kubenswrapper[4763]: I1006 16:06:51.987905 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.140527 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt\") pod \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.140657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage\") pod \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.140665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8db7e74e-229e-41da-9c7c-4b0cac6f5642" (UID: "8db7e74e-229e-41da-9c7c-4b0cac6f5642"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.140690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tl5\" (UniqueName: \"kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5\") pod \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\" (UID: \"8db7e74e-229e-41da-9c7c-4b0cac6f5642\") " Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.140977 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8db7e74e-229e-41da-9c7c-4b0cac6f5642-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.147014 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5" (OuterVolumeSpecName: "kube-api-access-f7tl5") pod "8db7e74e-229e-41da-9c7c-4b0cac6f5642" (UID: "8db7e74e-229e-41da-9c7c-4b0cac6f5642"). InnerVolumeSpecName "kube-api-access-f7tl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.160008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8db7e74e-229e-41da-9c7c-4b0cac6f5642" (UID: "8db7e74e-229e-41da-9c7c-4b0cac6f5642"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.242489 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8db7e74e-229e-41da-9c7c-4b0cac6f5642-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.242519 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tl5\" (UniqueName: \"kubernetes.io/projected/8db7e74e-229e-41da-9c7c-4b0cac6f5642-kube-api-access-f7tl5\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.686588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qv4td" event={"ID":"8db7e74e-229e-41da-9c7c-4b0cac6f5642","Type":"ContainerDied","Data":"c7454c4ae819ae52781e4ff1843b92ae78d473eac186896bfff160dbbdd27c4b"} Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.686656 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7454c4ae819ae52781e4ff1843b92ae78d473eac186896bfff160dbbdd27c4b" Oct 06 16:06:52 crc kubenswrapper[4763]: I1006 16:06:52.686764 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qv4td" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.365129 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qv4td"] Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.373912 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qv4td"] Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.537529 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6rvf6"] Oct 06 16:06:54 crc kubenswrapper[4763]: E1006 16:06:54.538276 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db7e74e-229e-41da-9c7c-4b0cac6f5642" containerName="storage" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.538307 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db7e74e-229e-41da-9c7c-4b0cac6f5642" containerName="storage" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.538591 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db7e74e-229e-41da-9c7c-4b0cac6f5642" containerName="storage" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.539549 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.542162 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.542205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6jdbs" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.542306 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.543287 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.547533 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6rvf6"] Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.586432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.586502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcxb\" (UniqueName: \"kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.586531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.687676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.687868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.687950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcxb\" (UniqueName: \"kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.688574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " 
pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.688971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.712840 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcxb\" (UniqueName: \"kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb\") pod \"crc-storage-crc-6rvf6\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:54 crc kubenswrapper[4763]: I1006 16:06:54.860053 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:55 crc kubenswrapper[4763]: I1006 16:06:55.390301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6rvf6"] Oct 06 16:06:55 crc kubenswrapper[4763]: I1006 16:06:55.586017 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db7e74e-229e-41da-9c7c-4b0cac6f5642" path="/var/lib/kubelet/pods/8db7e74e-229e-41da-9c7c-4b0cac6f5642/volumes" Oct 06 16:06:55 crc kubenswrapper[4763]: I1006 16:06:55.712104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rvf6" event={"ID":"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa","Type":"ContainerStarted","Data":"2107e98b954ac862ecd327e53c7202cc21d4b5b2be41299129b55bc7e3a31cf2"} Oct 06 16:06:56 crc kubenswrapper[4763]: I1006 16:06:56.723109 4763 generic.go:334] "Generic (PLEG): container finished" podID="c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" containerID="ef38378b0b34c3d4ddcbc34470915c2aea2d816a22bc146a9185d0161e40fb64" exitCode=0 Oct 06 16:06:56 crc kubenswrapper[4763]: I1006 16:06:56.723223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rvf6" event={"ID":"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa","Type":"ContainerDied","Data":"ef38378b0b34c3d4ddcbc34470915c2aea2d816a22bc146a9185d0161e40fb64"} Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.081985 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.231454 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt\") pod \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.231590 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcxb\" (UniqueName: \"kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb\") pod \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.231682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" (UID: "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.231835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage\") pod \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\" (UID: \"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa\") " Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.232961 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.236493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb" (OuterVolumeSpecName: "kube-api-access-jrcxb") pod "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" (UID: "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa"). InnerVolumeSpecName "kube-api-access-jrcxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.262034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" (UID: "c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.333799 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.333835 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcxb\" (UniqueName: \"kubernetes.io/projected/c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa-kube-api-access-jrcxb\") on node \"crc\" DevicePath \"\"" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.740453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rvf6" event={"ID":"c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa","Type":"ContainerDied","Data":"2107e98b954ac862ecd327e53c7202cc21d4b5b2be41299129b55bc7e3a31cf2"} Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.740491 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rvf6" Oct 06 16:06:58 crc kubenswrapper[4763]: I1006 16:06:58.740512 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2107e98b954ac862ecd327e53c7202cc21d4b5b2be41299129b55bc7e3a31cf2" Oct 06 16:06:58 crc kubenswrapper[4763]: E1006 16:06:58.814931 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc770e86f_ebd3_4f9d_a8c8_6ad648c5feaa.slice/crio-2107e98b954ac862ecd327e53c7202cc21d4b5b2be41299129b55bc7e3a31cf2\": RecentStats: unable to find data in memory cache]" Oct 06 16:07:28 crc kubenswrapper[4763]: I1006 16:07:28.706799 4763 scope.go:117] "RemoveContainer" containerID="4608cf4e7028faeeb3a1485936e706f8467c82a40c7c5826204521276501b127" Oct 06 16:08:03 crc kubenswrapper[4763]: I1006 16:08:03.884858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"] Oct 06 16:08:03 crc kubenswrapper[4763]: E1006 16:08:03.886021 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" containerName="storage" Oct 06 16:08:03 crc kubenswrapper[4763]: I1006 16:08:03.886043 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" containerName="storage" Oct 06 16:08:03 crc kubenswrapper[4763]: I1006 16:08:03.886385 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c770e86f-ebd3-4f9d-a8c8-6ad648c5feaa" containerName="storage" Oct 06 16:08:03 crc kubenswrapper[4763]: I1006 16:08:03.888460 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:03 crc kubenswrapper[4763]: I1006 16:08:03.910994 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"] Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.011369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvr6t\" (UniqueName: \"kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.011867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.011957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.113476 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content\") pod \"redhat-marketplace-5rchl\" (UID: 
\"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.113584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.113645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvr6t\" (UniqueName: \"kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.114411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.114599 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.371697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvr6t\" (UniqueName: \"kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t\") pod \"redhat-marketplace-5rchl\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:04 crc kubenswrapper[4763]: I1006 16:08:04.521942 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rchl"
Oct 06 16:08:05 crc kubenswrapper[4763]: I1006 16:08:05.007284 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"]
Oct 06 16:08:05 crc kubenswrapper[4763]: W1006 16:08:05.009130 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda99d17c_22a9_4d3a_a556_939adbdf09c2.slice/crio-d8c0e791ca0496a772c81665af855679ce0c1bb576b8b72741d40655cd503304 WatchSource:0}: Error finding container d8c0e791ca0496a772c81665af855679ce0c1bb576b8b72741d40655cd503304: Status 404 returned error can't find the container with id d8c0e791ca0496a772c81665af855679ce0c1bb576b8b72741d40655cd503304
Oct 06 16:08:05 crc kubenswrapper[4763]: I1006 16:08:05.343269 4763 generic.go:334] "Generic (PLEG): container finished" podID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerID="c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467" exitCode=0
Oct 06 16:08:05 crc kubenswrapper[4763]: I1006 16:08:05.343339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerDied","Data":"c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467"}
Oct 06 16:08:05 crc kubenswrapper[4763]: I1006 16:08:05.343557 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerStarted","Data":"d8c0e791ca0496a772c81665af855679ce0c1bb576b8b72741d40655cd503304"}
Oct 06 16:08:06 crc kubenswrapper[4763]: I1006 16:08:06.355322 4763 generic.go:334] "Generic (PLEG): container finished" podID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerID="a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8" exitCode=0
Oct 06 16:08:06 crc kubenswrapper[4763]: I1006 16:08:06.355376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerDied","Data":"a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8"}
Oct 06 16:08:07 crc kubenswrapper[4763]: I1006 16:08:07.366874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerStarted","Data":"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87"}
Oct 06 16:08:07 crc kubenswrapper[4763]: I1006 16:08:07.385449 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rchl" podStartSLOduration=2.685980801 podStartE2EDuration="4.385423663s" podCreationTimestamp="2025-10-06 16:08:03 +0000 UTC" firstStartedPulling="2025-10-06 16:08:05.344778107 +0000 UTC m=+4482.500070619" lastFinishedPulling="2025-10-06 16:08:07.044220929 +0000 UTC m=+4484.199513481" observedRunningTime="2025-10-06 16:08:07.382648818 +0000 UTC m=+4484.537941350" watchObservedRunningTime="2025-10-06 16:08:07.385423663 +0000 UTC m=+4484.540716195"
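The pod_startup_latency_tracker entry above separates image-pull time from the startup SLO figure, and the arithmetic checks out against the monotonic m=+ offsets it records: pulling took 4484.199513481 - 4482.500070619 = 1.699442862s, and subtracting that from the end-to-end duration gives 4.385423663 - 1.699442862 = 2.685980801s, which is exactly the reported podStartSLOduration.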
pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:14 crc kubenswrapper[4763]: I1006 16:08:14.590037 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:15 crc kubenswrapper[4763]: I1006 16:08:15.512723 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:15 crc kubenswrapper[4763]: I1006 16:08:15.591115 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"] Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.452990 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rchl" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="registry-server" containerID="cri-o://006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87" gracePeriod=2 Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.868213 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.920775 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content\") pod \"da99d17c-22a9-4d3a-a556-939adbdf09c2\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.920847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities\") pod \"da99d17c-22a9-4d3a-a556-939adbdf09c2\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.920948 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvr6t\" (UniqueName: \"kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t\") pod \"da99d17c-22a9-4d3a-a556-939adbdf09c2\" (UID: \"da99d17c-22a9-4d3a-a556-939adbdf09c2\") " Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.921650 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities" (OuterVolumeSpecName: "utilities") pod "da99d17c-22a9-4d3a-a556-939adbdf09c2" (UID: "da99d17c-22a9-4d3a-a556-939adbdf09c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.930893 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t" (OuterVolumeSpecName: "kube-api-access-dvr6t") pod "da99d17c-22a9-4d3a-a556-939adbdf09c2" (UID: "da99d17c-22a9-4d3a-a556-939adbdf09c2"). InnerVolumeSpecName "kube-api-access-dvr6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:08:17 crc kubenswrapper[4763]: I1006 16:08:17.933102 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da99d17c-22a9-4d3a-a556-939adbdf09c2" (UID: "da99d17c-22a9-4d3a-a556-939adbdf09c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.022861 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvr6t\" (UniqueName: \"kubernetes.io/projected/da99d17c-22a9-4d3a-a556-939adbdf09c2-kube-api-access-dvr6t\") on node \"crc\" DevicePath \"\"" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.022904 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.022918 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da99d17c-22a9-4d3a-a556-939adbdf09c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.474739 4763 generic.go:334] "Generic (PLEG): container finished" podID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerID="006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87" exitCode=0 Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.474790 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerDied","Data":"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87"} Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.474820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rchl" event={"ID":"da99d17c-22a9-4d3a-a556-939adbdf09c2","Type":"ContainerDied","Data":"d8c0e791ca0496a772c81665af855679ce0c1bb576b8b72741d40655cd503304"} Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.474836 4763 scope.go:117] "RemoveContainer" containerID="006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.474964 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rchl" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.520684 4763 scope.go:117] "RemoveContainer" containerID="a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.525438 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"] Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.533558 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rchl"] Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.550869 4763 scope.go:117] "RemoveContainer" containerID="c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.584703 4763 scope.go:117] "RemoveContainer" containerID="006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87" Oct 06 16:08:18 crc kubenswrapper[4763]: E1006 16:08:18.585121 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87\": container with ID starting with 006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87 not found: ID does not exist" containerID="006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.585159 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87"} err="failed to get container status \"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87\": rpc error: code = NotFound desc = could not find container \"006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87\": container with ID starting with 006cb451d4579300d205ed0c02aa75f70440b675ac374787b02a20db86af2b87 not found: ID does not exist" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.585184 4763 scope.go:117] "RemoveContainer" containerID="a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8" Oct 06 16:08:18 crc kubenswrapper[4763]: E1006 16:08:18.585877 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8\": container with ID starting with a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8 not found: ID does not exist" containerID="a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.585920 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8"} err="failed to get container status \"a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8\": rpc error: code = NotFound desc = could not find container \"a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8\": container with ID starting with a3e0b2cea7f2aac71ee520fded47ffd49b3cb74b5b866a2afd6428458af74ff8 not found: ID does not exist" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.585949 4763 scope.go:117] "RemoveContainer" containerID="c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467" Oct 06 16:08:18 crc kubenswrapper[4763]: E1006 16:08:18.586317 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467\": container with ID starting with c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467 not found: ID does not exist" containerID="c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467" Oct 06 16:08:18 crc kubenswrapper[4763]: I1006 16:08:18.586345 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467"} err="failed to get container status \"c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467\": rpc error: code = NotFound desc = could not find container \"c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467\": container with ID starting with c9873659ac50f46d62207629190a3768570316973a9206ebb66ba1aaf2f19467 not found: ID does not exist" Oct 06 16:08:19 crc kubenswrapper[4763]: I1006 16:08:19.584079 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" path="/var/lib/kubelet/pods/da99d17c-22a9-4d3a-a556-939adbdf09c2/volumes" Oct 06 16:08:33 crc kubenswrapper[4763]: I1006 16:08:33.877681 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:08:33 crc kubenswrapper[4763]: I1006 16:08:33.878204 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:09:03 crc kubenswrapper[4763]: I1006 16:09:03.876534 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:09:03 crc kubenswrapper[4763]: I1006 16:09:03.877219 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:09:33 crc kubenswrapper[4763]: I1006 16:09:33.877203 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:09:33 crc kubenswrapper[4763]: I1006 16:09:33.877947 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:09:33 crc kubenswrapper[4763]: I1006 16:09:33.878098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:09:33 crc kubenswrapper[4763]: I1006 16:09:33.879087 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:09:33 crc kubenswrapper[4763]: I1006 16:09:33.879210 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" gracePeriod=600 Oct 06 16:09:34 crc kubenswrapper[4763]: E1006 16:09:34.010542 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:09:34 crc kubenswrapper[4763]: I1006 16:09:34.196087 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" exitCode=0 Oct 06 16:09:34 crc kubenswrapper[4763]: I1006 16:09:34.196140 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a"} Oct 06 16:09:34 crc kubenswrapper[4763]: I1006 16:09:34.196177 4763 scope.go:117] "RemoveContainer" containerID="7d51a5f3afd59299ca772e66633eb85e7f25d92468a6eba3a50b479a8137af4a" Oct 06 16:09:34 crc kubenswrapper[4763]: I1006 16:09:34.196950 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:09:34 crc kubenswrapper[4763]: E1006 16:09:34.199135 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:09:45 crc kubenswrapper[4763]: I1006 16:09:45.576024 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:09:45 crc kubenswrapper[4763]: E1006 16:09:45.577265 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:09:56 crc 
kubenswrapper[4763]: I1006 16:09:56.574866 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:09:56 crc kubenswrapper[4763]: E1006 16:09:56.576008 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.709056 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:02 crc kubenswrapper[4763]: E1006 16:10:02.709737 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="registry-server" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.709750 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="registry-server" Oct 06 16:10:02 crc kubenswrapper[4763]: E1006 16:10:02.709771 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="extract-utilities" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.709777 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="extract-utilities" Oct 06 16:10:02 crc kubenswrapper[4763]: E1006 16:10:02.709788 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="extract-content" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.709796 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="extract-content" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.709938 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da99d17c-22a9-4d3a-a556-939adbdf09c2" containerName="registry-server" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.710607 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.722986 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.723049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.723124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtwn4\" (UniqueName: \"kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.737696 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.737783 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.737825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.737871 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-g6nqk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.737978 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.743722 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.824451 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtwn4\" (UniqueName: \"kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.824892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.824921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.825906 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.825945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.854995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtwn4\" (UniqueName: \"kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4\") pod \"dnsmasq-dns-5d7b5456f5-lrqmk\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.880603 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"] Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.886429 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:02 crc kubenswrapper[4763]: I1006 16:10:02.892025 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"] Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.027717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.027801 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.027833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsbb\" (UniqueName: \"kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.063172 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.129049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.129093 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.129118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsbb\" (UniqueName: \"kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.130156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.131111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.146525 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsbb\" (UniqueName: \"kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb\") pod \"dnsmasq-dns-98ddfc8f-pt8b7\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") " pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.206883 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.505307 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.657293 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"] Oct 06 16:10:03 crc kubenswrapper[4763]: W1006 16:10:03.661995 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f698bb_55ff_4199_afcd_b7d16ac14754.slice/crio-6f17e680e09a6203fb10910a293ea0805ac370109c6da283cc0ee8c8553595f8 WatchSource:0}: Error finding container 6f17e680e09a6203fb10910a293ea0805ac370109c6da283cc0ee8c8553595f8: Status 404 returned error can't find the container with id 6f17e680e09a6203fb10910a293ea0805ac370109c6da283cc0ee8c8553595f8 Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.762813 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.763970 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.769101 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.769325 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.770239 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.770491 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.770542 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qvrql" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.775690 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940884 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs6sg\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.940985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.941099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.941116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:03 crc kubenswrapper[4763]: I1006 16:10:03.941138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.030110 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.032563 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.036769 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.036914 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.038051 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g8djm" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.038160 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.038747 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042543 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " 
pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs6sg\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.042588 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.043073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.043281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.043675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.044110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.044392 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.047663 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.047725 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/79df105821da558ff8a5667c6a1c58c7bd3de808e4eaa73f23cfde43a2c50242/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.047899 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.048106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.056208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.065585 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs6sg\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.082370 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") " pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.092298 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ksf9\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144577 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144600 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.144646 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.245996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksf9\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.246044 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.246065 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.246091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.246438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.246534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.247226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.250320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.251963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.252081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.263307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.264642 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.264680 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f1abfb4440df30f4c33d4ec8b540cf45ddbcc43283db36e58f92db789df22e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.267271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ksf9\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.287103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.408365 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.440677 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerID="120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5" exitCode=0 Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.440764 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" event={"ID":"b1f698bb-55ff-4199-afcd-b7d16ac14754","Type":"ContainerDied","Data":"120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5"} Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.440794 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" event={"ID":"b1f698bb-55ff-4199-afcd-b7d16ac14754","Type":"ContainerStarted","Data":"6f17e680e09a6203fb10910a293ea0805ac370109c6da283cc0ee8c8553595f8"} Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.442135 4763 generic.go:334] "Generic (PLEG): container finished" podID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerID="3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727" exitCode=0 Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.442175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" event={"ID":"7515f331-e24e-4726-a24d-fe1a67cdee83","Type":"ContainerDied","Data":"3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727"} Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.442197 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" event={"ID":"7515f331-e24e-4726-a24d-fe1a67cdee83","Type":"ContainerStarted","Data":"67840a1e015c8a84bd21b94ab46aa9191a4a3738d2d583c3a4fd85e7bf0b6f73"} Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.511289 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 16:10:04 crc kubenswrapper[4763]: W1006 16:10:04.671667 4763 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aff9dac_999e_4ba7_84fd_4c7bf4383a0c.slice/crio-2eab94587d230d421bd9d659200fb13a5f2b70d58bcdf159c6605bba040fa601 WatchSource:0}: Error finding container 2eab94587d230d421bd9d659200fb13a5f2b70d58bcdf159c6605bba040fa601: Status 404 returned error can't find the container with id 2eab94587d230d421bd9d659200fb13a5f2b70d58bcdf159c6605bba040fa601 Oct 06 16:10:04 crc kubenswrapper[4763]: I1006 16:10:04.886244 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 16:10:04 crc kubenswrapper[4763]: W1006 16:10:04.894581 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a5a7fc_7e4b_4b0e_a83e_ba5e77659fb6.slice/crio-101b794579928acb128b24c8a0c6abcda6bf21657adbc9e53d3d78b66567b53f WatchSource:0}: Error finding container 101b794579928acb128b24c8a0c6abcda6bf21657adbc9e53d3d78b66567b53f: Status 404 returned error can't find the container with id 101b794579928acb128b24c8a0c6abcda6bf21657adbc9e53d3d78b66567b53f Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.467090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" event={"ID":"7515f331-e24e-4726-a24d-fe1a67cdee83","Type":"ContainerStarted","Data":"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61"} Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.467449 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.468981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" event={"ID":"b1f698bb-55ff-4199-afcd-b7d16ac14754","Type":"ContainerStarted","Data":"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1"} Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.469113 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.470231 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerStarted","Data":"2eab94587d230d421bd9d659200fb13a5f2b70d58bcdf159c6605bba040fa601"} Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.475189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerStarted","Data":"101b794579928acb128b24c8a0c6abcda6bf21657adbc9e53d3d78b66567b53f"} Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.492021 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" podStartSLOduration=3.491999775 podStartE2EDuration="3.491999775s" podCreationTimestamp="2025-10-06 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:05.489517018 +0000 UTC m=+4602.644809540" watchObservedRunningTime="2025-10-06 16:10:05.491999775 +0000 UTC m=+4602.647292307" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.514045 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" podStartSLOduration=3.514027205 
podStartE2EDuration="3.514027205s" podCreationTimestamp="2025-10-06 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:05.511885306 +0000 UTC m=+4602.667177818" watchObservedRunningTime="2025-10-06 16:10:05.514027205 +0000 UTC m=+4602.669319707" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.624251 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.625320 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.627911 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.634734 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2p7qx" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.638303 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.772993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-kolla-config\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.773044 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-config-data\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.773332 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7764\" (UniqueName: \"kubernetes.io/projected/420576a9-0dd7-4f4d-a043-579ee73ffa30-kube-api-access-j7764\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.874694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-kolla-config\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.874754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-config-data\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.874798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7764\" (UniqueName: \"kubernetes.io/projected/420576a9-0dd7-4f4d-a043-579ee73ffa30-kube-api-access-j7764\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.875538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-kolla-config\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.875589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/420576a9-0dd7-4f4d-a043-579ee73ffa30-config-data\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:05 crc kubenswrapper[4763]: I1006 16:10:05.965366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7764\" (UniqueName: \"kubernetes.io/projected/420576a9-0dd7-4f4d-a043-579ee73ffa30-kube-api-access-j7764\") pod \"memcached-0\" (UID: \"420576a9-0dd7-4f4d-a043-579ee73ffa30\") " pod="openstack/memcached-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.243504 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.482497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerStarted","Data":"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"} Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.487085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerStarted","Data":"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"} Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.780597 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.836820 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.839186 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.845800 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.846165 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.848318 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.850389 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.856334 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nrgrv" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.858589 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.861484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.870700 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.873388 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.876954 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9tvzt" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.877153 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.877357 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.877389 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.884309 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.993836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.993989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5h2\" (UniqueName: \"kubernetes.io/projected/46c419e4-1d20-4b9a-a135-022addb7e278-kube-api-access-jd5h2\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994246 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994339 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994479 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994704 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.994793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.995052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jrld\" (UniqueName: \"kubernetes.io/projected/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kube-api-access-6jrld\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.995096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.995127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-secrets\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:06 crc kubenswrapper[4763]: I1006 16:10:06.995173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096236 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-secrets\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096254 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096273 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5h2\" (UniqueName: \"kubernetes.io/projected/46c419e4-1d20-4b9a-a135-022addb7e278-kube-api-access-jd5h2\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096356 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096963 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.096990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jrld\" (UniqueName: \"kubernetes.io/projected/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kube-api-access-6jrld\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " 
pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.097916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c419e4-1d20-4b9a-a135-022addb7e278-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.098318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.102356 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.102404 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.102812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-secrets\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.102911 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.103303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.103370 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.103398 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/02f34fb89f68e6fe685731111a1bb8b748b0d8876af42351f17de77134b43afa/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.103489 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.103555 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce04a89e6dc9c79c4b655828039791133702fa83752ab0278260df163180b4f8/globalmount\"" pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.104051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.105667 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.107399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46c419e4-1d20-4b9a-a135-022addb7e278-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.113703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5h2\" (UniqueName: \"kubernetes.io/projected/46c419e4-1d20-4b9a-a135-022addb7e278-kube-api-access-jd5h2\") pod \"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.117422 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jrld\" (UniqueName: \"kubernetes.io/projected/8d4bc5a2-36a8-4c9f-9aad-e5b395a74954-kube-api-access-6jrld\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.138403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89aa7f1a-427d-4ae6-b506-e79d71a2f5b3\") pod 
\"openstack-cell1-galera-0\" (UID: \"46c419e4-1d20-4b9a-a135-022addb7e278\") " pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.142182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32b80482-cce6-4608-8dc9-a26cfa7ada75\") pod \"openstack-galera-0\" (UID: \"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954\") " pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.203395 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.223797 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.495444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"420576a9-0dd7-4f4d-a043-579ee73ffa30","Type":"ContainerStarted","Data":"f01fcf86b02fb6359b0e14c0a05ad30be8d0b151c662bf98e52c49c01ad9a855"} Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.495500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"420576a9-0dd7-4f4d-a043-579ee73ffa30","Type":"ContainerStarted","Data":"f5ea4ec4e600b2954429311d3a65b8bf750577b6f7c1a42ea9a286a7a1c8dbdc"} Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.512301 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.512273948 podStartE2EDuration="2.512273948s" podCreationTimestamp="2025-10-06 16:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:07.509217035 +0000 UTC m=+4604.664509637" watchObservedRunningTime="2025-10-06 16:10:07.512273948 +0000 UTC m=+4604.667566470" Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.671430 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 16:10:07 crc kubenswrapper[4763]: W1006 16:10:07.677938 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d4bc5a2_36a8_4c9f_9aad_e5b395a74954.slice/crio-dbe31cd2c13ee4ca074afd3b4f04887d6c1a2a1e7bedc74ce2f8beb6a2851fcb WatchSource:0}: Error finding container dbe31cd2c13ee4ca074afd3b4f04887d6c1a2a1e7bedc74ce2f8beb6a2851fcb: Status 404 returned error can't find the container with id dbe31cd2c13ee4ca074afd3b4f04887d6c1a2a1e7bedc74ce2f8beb6a2851fcb Oct 06 16:10:07 crc kubenswrapper[4763]: I1006 16:10:07.720217 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 16:10:08.506187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46c419e4-1d20-4b9a-a135-022addb7e278","Type":"ContainerStarted","Data":"45e9b089b3fd76fa0bc5222a0893843e8b6306ac90bb6a92dc04d88c67dedee5"} Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 16:10:08.506386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46c419e4-1d20-4b9a-a135-022addb7e278","Type":"ContainerStarted","Data":"2b083fdb8b79cce9a80c08f131a2a7c1a29f9be61e1ab1b8810d165a4ac21033"} Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 
16:10:08.507058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954","Type":"ContainerStarted","Data":"f520d5a10aa77a7af12c15f535a4b5c5bd6ca025ca30d8cc74f8a82ed0d3dde8"} Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 16:10:08.507080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954","Type":"ContainerStarted","Data":"dbe31cd2c13ee4ca074afd3b4f04887d6c1a2a1e7bedc74ce2f8beb6a2851fcb"} Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 16:10:08.507229 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 16:10:08 crc kubenswrapper[4763]: I1006 16:10:08.575251 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:10:08 crc kubenswrapper[4763]: E1006 16:10:08.575561 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:10:11 crc kubenswrapper[4763]: I1006 16:10:11.245608 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 16:10:11 crc kubenswrapper[4763]: I1006 16:10:11.535650 4763 generic.go:334] "Generic (PLEG): container finished" podID="46c419e4-1d20-4b9a-a135-022addb7e278" containerID="45e9b089b3fd76fa0bc5222a0893843e8b6306ac90bb6a92dc04d88c67dedee5" exitCode=0 Oct 06 16:10:11 crc kubenswrapper[4763]: I1006 16:10:11.535751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46c419e4-1d20-4b9a-a135-022addb7e278","Type":"ContainerDied","Data":"45e9b089b3fd76fa0bc5222a0893843e8b6306ac90bb6a92dc04d88c67dedee5"} Oct 06 16:10:11 crc kubenswrapper[4763]: I1006 16:10:11.537297 4763 generic.go:334] "Generic (PLEG): container finished" podID="8d4bc5a2-36a8-4c9f-9aad-e5b395a74954" containerID="f520d5a10aa77a7af12c15f535a4b5c5bd6ca025ca30d8cc74f8a82ed0d3dde8" exitCode=0 Oct 06 16:10:11 crc kubenswrapper[4763]: I1006 16:10:11.537360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954","Type":"ContainerDied","Data":"f520d5a10aa77a7af12c15f535a4b5c5bd6ca025ca30d8cc74f8a82ed0d3dde8"} Oct 06 16:10:12 crc kubenswrapper[4763]: I1006 16:10:12.546664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46c419e4-1d20-4b9a-a135-022addb7e278","Type":"ContainerStarted","Data":"d7f0807087b70e6b64402f37297d48c586e2fe7bc5f8bf2dae0e22fc8461221a"} Oct 06 16:10:12 crc kubenswrapper[4763]: I1006 16:10:12.549961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d4bc5a2-36a8-4c9f-9aad-e5b395a74954","Type":"ContainerStarted","Data":"7ab284e9ee57295a6764558294131f337a875377fd4e76ab0021ef536f8e9f75"} Oct 06 16:10:12 crc kubenswrapper[4763]: I1006 16:10:12.578301 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.578280406 podStartE2EDuration="7.578280406s" 
podCreationTimestamp="2025-10-06 16:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:12.571230114 +0000 UTC m=+4609.726522656" watchObservedRunningTime="2025-10-06 16:10:12.578280406 +0000 UTC m=+4609.733572938" Oct 06 16:10:12 crc kubenswrapper[4763]: I1006 16:10:12.618726 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.618706436 podStartE2EDuration="7.618706436s" podCreationTimestamp="2025-10-06 16:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:12.598814774 +0000 UTC m=+4609.754107316" watchObservedRunningTime="2025-10-06 16:10:12.618706436 +0000 UTC m=+4609.773998958" Oct 06 16:10:13 crc kubenswrapper[4763]: I1006 16:10:13.064594 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:13 crc kubenswrapper[4763]: I1006 16:10:13.210889 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:10:13 crc kubenswrapper[4763]: I1006 16:10:13.262637 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:13 crc kubenswrapper[4763]: I1006 16:10:13.555699 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="dnsmasq-dns" containerID="cri-o://3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61" gracePeriod=10 Oct 06 16:10:13 crc kubenswrapper[4763]: I1006 16:10:13.954152 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.039438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtwn4\" (UniqueName: \"kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4\") pod \"7515f331-e24e-4726-a24d-fe1a67cdee83\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.039675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc\") pod \"7515f331-e24e-4726-a24d-fe1a67cdee83\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.039892 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config\") pod \"7515f331-e24e-4726-a24d-fe1a67cdee83\" (UID: \"7515f331-e24e-4726-a24d-fe1a67cdee83\") " Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.053854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4" (OuterVolumeSpecName: "kube-api-access-qtwn4") pod "7515f331-e24e-4726-a24d-fe1a67cdee83" (UID: "7515f331-e24e-4726-a24d-fe1a67cdee83"). InnerVolumeSpecName "kube-api-access-qtwn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.076648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config" (OuterVolumeSpecName: "config") pod "7515f331-e24e-4726-a24d-fe1a67cdee83" (UID: "7515f331-e24e-4726-a24d-fe1a67cdee83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.087555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7515f331-e24e-4726-a24d-fe1a67cdee83" (UID: "7515f331-e24e-4726-a24d-fe1a67cdee83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.141783 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtwn4\" (UniqueName: \"kubernetes.io/projected/7515f331-e24e-4726-a24d-fe1a67cdee83-kube-api-access-qtwn4\") on node \"crc\" DevicePath \"\"" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.142169 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.142340 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7515f331-e24e-4726-a24d-fe1a67cdee83-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.568194 4763 generic.go:334] "Generic (PLEG): container finished" podID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerID="3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61" exitCode=0 Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.568236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" event={"ID":"7515f331-e24e-4726-a24d-fe1a67cdee83","Type":"ContainerDied","Data":"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61"} Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.568269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" event={"ID":"7515f331-e24e-4726-a24d-fe1a67cdee83","Type":"ContainerDied","Data":"67840a1e015c8a84bd21b94ab46aa9191a4a3738d2d583c3a4fd85e7bf0b6f73"} Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.568288 4763 scope.go:117] "RemoveContainer" containerID="3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.568294 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lrqmk" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.608016 4763 scope.go:117] "RemoveContainer" containerID="3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.623738 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.633659 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lrqmk"] Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.657585 4763 scope.go:117] "RemoveContainer" containerID="3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61" Oct 06 16:10:14 crc kubenswrapper[4763]: E1006 16:10:14.658200 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61\": container with ID starting with 3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61 not found: ID does not exist" containerID="3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.658237 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61"} err="failed to get container status \"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61\": rpc error: code = NotFound desc = could not find container \"3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61\": container with ID starting with 3f12e5e00959c77c082d95d193dd879f241108660e6f2e0501c0f010b55e4b61 not found: ID does not exist" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.658262 4763 scope.go:117] "RemoveContainer" containerID="3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727" Oct 06 16:10:14 crc kubenswrapper[4763]: E1006 16:10:14.658598 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727\": container with ID starting with 3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727 not found: ID does not exist" containerID="3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727" Oct 06 16:10:14 crc kubenswrapper[4763]: I1006 16:10:14.658665 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727"} err="failed to get container status \"3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727\": rpc error: code = NotFound desc = could not find container \"3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727\": container with ID starting with 3ac91ad567808f073612a2661903103c9dc778846224cc19d94e1bdab3869727 not found: ID does not exist" Oct 06 16:10:15 crc kubenswrapper[4763]: I1006 16:10:15.583731 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" path="/var/lib/kubelet/pods/7515f331-e24e-4726-a24d-fe1a67cdee83/volumes" Oct 06 16:10:17 crc kubenswrapper[4763]: I1006 16:10:17.204781 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 06 16:10:17 crc kubenswrapper[4763]: I1006 16:10:17.205124 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 06 16:10:17 crc kubenswrapper[4763]: I1006 16:10:17.224420 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:17 crc kubenswrapper[4763]: I1006 16:10:17.224481 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:19 crc kubenswrapper[4763]: I1006 16:10:19.297952 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:19 crc kubenswrapper[4763]: I1006 16:10:19.384316 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 16:10:21 crc kubenswrapper[4763]: I1006 16:10:21.292254 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 16:10:21 crc kubenswrapper[4763]: I1006 16:10:21.367610 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 16:10:22 crc kubenswrapper[4763]: I1006 16:10:22.575853 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:10:22 crc kubenswrapper[4763]: E1006 16:10:22.576510 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:10:37 crc kubenswrapper[4763]: I1006 16:10:37.575099 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:10:37 crc kubenswrapper[4763]: E1006 16:10:37.576023 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:10:38 crc kubenswrapper[4763]: I1006 16:10:38.790340 4763 generic.go:334] "Generic (PLEG): container finished" podID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerID="b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d" exitCode=0 Oct 06 16:10:38 crc kubenswrapper[4763]: I1006 16:10:38.790476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerDied","Data":"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"} Oct 06 16:10:39 crc kubenswrapper[4763]: I1006 16:10:39.802837 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerStarted","Data":"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"} Oct 06 16:10:39 crc kubenswrapper[4763]: I1006 16:10:39.803644 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 16:10:39 crc kubenswrapper[4763]: 
I1006 16:10:39.805908 4763 generic.go:334] "Generic (PLEG): container finished" podID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerID="625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b" exitCode=0 Oct 06 16:10:39 crc kubenswrapper[4763]: I1006 16:10:39.805972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerDied","Data":"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"} Oct 06 16:10:39 crc kubenswrapper[4763]: I1006 16:10:39.842160 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.842138097 podStartE2EDuration="37.842138097s" podCreationTimestamp="2025-10-06 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:39.841797308 +0000 UTC m=+4636.997089870" watchObservedRunningTime="2025-10-06 16:10:39.842138097 +0000 UTC m=+4636.997430649" Oct 06 16:10:40 crc kubenswrapper[4763]: I1006 16:10:40.815816 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerStarted","Data":"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"} Oct 06 16:10:40 crc kubenswrapper[4763]: I1006 16:10:40.816335 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:10:40 crc kubenswrapper[4763]: I1006 16:10:40.847508 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.847485332 podStartE2EDuration="38.847485332s" podCreationTimestamp="2025-10-06 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:10:40.8407708 +0000 UTC m=+4637.996063362" watchObservedRunningTime="2025-10-06 16:10:40.847485332 +0000 UTC m=+4638.002777864" Oct 06 16:10:52 crc kubenswrapper[4763]: I1006 16:10:52.574573 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:10:52 crc kubenswrapper[4763]: E1006 16:10:52.575182 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:10:54 crc kubenswrapper[4763]: I1006 16:10:54.095777 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 16:10:54 crc kubenswrapper[4763]: I1006 16:10:54.411891 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.091795 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"] Oct 06 16:11:00 crc kubenswrapper[4763]: E1006 16:11:00.093288 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="dnsmasq-dns" Oct 06 16:11:00 crc kubenswrapper[4763]: 
I1006 16:11:00.093313 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="dnsmasq-dns"
Oct 06 16:11:00 crc kubenswrapper[4763]: E1006 16:11:00.093355 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="init"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.093369 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="init"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.093689 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7515f331-e24e-4726-a24d-fe1a67cdee83" containerName="dnsmasq-dns"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.095070 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.104819 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"]
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.208507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xg5\" (UniqueName: \"kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.208567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.208590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.309919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xg5\" (UniqueName: \"kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.310229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.310258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.311309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.311861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.332372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xg5\" (UniqueName: \"kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5\") pod \"dnsmasq-dns-5b7946d7b9-pjx7t\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.458231 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.782298 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.910168 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"]
Oct 06 16:11:00 crc kubenswrapper[4763]: I1006 16:11:00.990326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" event={"ID":"1c316e7d-6e65-42c3-b67d-93fc3027979d","Type":"ContainerStarted","Data":"a06c5feb4065a923a7995d9fb98189e037fa30423beac30bd820785fa5154a61"}
Oct 06 16:11:01 crc kubenswrapper[4763]: I1006 16:11:01.419213 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:01 crc kubenswrapper[4763]: I1006 16:11:01.998408 4763 generic.go:334] "Generic (PLEG): container finished" podID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerID="4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c" exitCode=0
Oct 06 16:11:01 crc kubenswrapper[4763]: I1006 16:11:01.998490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" event={"ID":"1c316e7d-6e65-42c3-b67d-93fc3027979d","Type":"ContainerDied","Data":"4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c"}
Oct 06 16:11:02 crc kubenswrapper[4763]: I1006 16:11:02.470115 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="rabbitmq" containerID="cri-o://b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18" gracePeriod=604799
Oct 06 16:11:03 crc kubenswrapper[4763]: I1006 16:11:03.009985 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" event={"ID":"1c316e7d-6e65-42c3-b67d-93fc3027979d","Type":"ContainerStarted","Data":"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"}
Oct 06 16:11:03 crc kubenswrapper[4763]: I1006 16:11:03.010209 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:03 crc kubenswrapper[4763]: I1006 16:11:03.037541 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" podStartSLOduration=3.037520042 podStartE2EDuration="3.037520042s" podCreationTimestamp="2025-10-06 16:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:11:03.037200323 +0000 UTC m=+4660.192492855" watchObservedRunningTime="2025-10-06 16:11:03.037520042 +0000 UTC m=+4660.192812554"
Oct 06 16:11:03 crc kubenswrapper[4763]: I1006 16:11:03.111673 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="rabbitmq" containerID="cri-o://121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443" gracePeriod=604799
Oct 06 16:11:03 crc kubenswrapper[4763]: I1006 16:11:03.584191 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a"
Oct 06 16:11:03 crc kubenswrapper[4763]: E1006 16:11:03.584607 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 16:11:04 crc kubenswrapper[4763]: I1006 16:11:04.093173 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused"
Oct 06 16:11:04 crc kubenswrapper[4763]: I1006 16:11:04.409337 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.050755 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.084848 4763 generic.go:334] "Generic (PLEG): container finished" podID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerID="b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18" exitCode=0
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.084897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerDied","Data":"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"}
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.084924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c","Type":"ContainerDied","Data":"2eab94587d230d421bd9d659200fb13a5f2b70d58bcdf159c6605bba040fa601"}
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.084944 4763 scope.go:117] "RemoveContainer" containerID="b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.085137 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.105813 4763 scope.go:117] "RemoveContainer" containerID="b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.131902 4763 scope.go:117] "RemoveContainer" containerID="b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"
Oct 06 16:11:09 crc kubenswrapper[4763]: E1006 16:11:09.133840 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18\": container with ID starting with b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18 not found: ID does not exist" containerID="b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.133919 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18"} err="failed to get container status \"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18\": rpc error: code = NotFound desc = could not find container \"b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18\": container with ID starting with b2ebe57862900441c9ef4e0efec4e9dec0f67016b154e4e5b34292e9ffe00c18 not found: ID does not exist"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.133956 4763 scope.go:117] "RemoveContainer" containerID="b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"
Oct 06 16:11:09 crc kubenswrapper[4763]: E1006 16:11:09.134290 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d\": container with ID starting with b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d not found: ID does not exist" containerID="b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.134322 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d"} err="failed to get container status \"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d\": rpc error: code = NotFound desc = could not find container \"b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d\": container with ID starting with b87c0a314c0050a26c2acb696458f6ae1c99d879bee07326cc08600e270bc29d not found: ID does not exist"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160088 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160387 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160430 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs6sg\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160519 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160552 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160593 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf\") pod \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\" (UID: \"3aff9dac-999e-4ba7-84fd-4c7bf4383a0c\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160777 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.160929 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.161214 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.161568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.167674 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info" (OuterVolumeSpecName: "pod-info") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.167803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg" (OuterVolumeSpecName: "kube-api-access-gs6sg") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "kube-api-access-gs6sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.169726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.175559 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc" (OuterVolumeSpecName: "persistence") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.182669 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf" (OuterVolumeSpecName: "server-conf") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.242209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" (UID: "3aff9dac-999e-4ba7-84fd-4c7bf4383a0c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.262757 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.262988 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263065 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263123 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263186 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263249 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263342 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") on node \"crc\" "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.263459 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs6sg\" (UniqueName: \"kubernetes.io/projected/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c-kube-api-access-gs6sg\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.279309 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.279650 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc") on node "crc"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.365222 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.420230 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.425504 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.457335 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:09 crc kubenswrapper[4763]: E1006 16:11:09.457768 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="setup-container"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.457808 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="setup-container"
Oct 06 16:11:09 crc kubenswrapper[4763]: E1006 16:11:09.457841 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="rabbitmq"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.457849 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="rabbitmq"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.458110 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" containerName="rabbitmq"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.459445 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.470208 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.470888 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.471224 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.472426 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qvrql"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.472609 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.472547 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567589 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567735 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b80f7807-fbc1-49a4-9792-293545704695-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567814 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4g6\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-kube-api-access-zk4g6\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.567927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b80f7807-fbc1-49a4-9792-293545704695-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.587677 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aff9dac-999e-4ba7-84fd-4c7bf4383a0c" path="/var/lib/kubelet/pods/3aff9dac-999e-4ba7-84fd-4c7bf4383a0c/volumes"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b80f7807-fbc1-49a4-9792-293545704695-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b80f7807-fbc1-49a4-9792-293545704695-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670587 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.670673 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4g6\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-kube-api-access-zk4g6\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.672835 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.673664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.674180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b80f7807-fbc1-49a4-9792-293545704695-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.674738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.687675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.688451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b80f7807-fbc1-49a4-9792-293545704695-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.688957 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.688993 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/79df105821da558ff8a5667c6a1c58c7bd3de808e4eaa73f23cfde43a2c50242/globalmount\"" pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.697216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b80f7807-fbc1-49a4-9792-293545704695-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.712493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4g6\" (UniqueName: \"kubernetes.io/projected/b80f7807-fbc1-49a4-9792-293545704695-kube-api-access-zk4g6\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.747905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eeb0a7c1-da31-4898-9aad-060da20b46cc\") pod \"rabbitmq-server-0\" (UID: \"b80f7807-fbc1-49a4-9792-293545704695\") " pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.780256 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.796201 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873527 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ksf9\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873694 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873736 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.873936 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret\") pod \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\" (UID: \"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6\") "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.876282 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.877320 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.879940 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info" (OuterVolumeSpecName: "pod-info") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.880226 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.880509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.884432 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9" (OuterVolumeSpecName: "kube-api-access-9ksf9") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "kube-api-access-9ksf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.895646 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903" (OuterVolumeSpecName: "persistence") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.901372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf" (OuterVolumeSpecName: "server-conf") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.956377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" (UID: "c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976073 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976114 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ksf9\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-kube-api-access-9ksf9\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976127 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976139 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976151 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976164 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976175 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976185 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.976226 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") on node \"crc\" "
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.995306 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 06 16:11:09 crc kubenswrapper[4763]: I1006 16:11:09.995449 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903") on node "crc"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.053758 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.077677 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.107037 4763 generic.go:334] "Generic (PLEG): container finished" podID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerID="121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443" exitCode=0
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.107158 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.107163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerDied","Data":"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"}
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.107640 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6","Type":"ContainerDied","Data":"101b794579928acb128b24c8a0c6abcda6bf21657adbc9e53d3d78b66567b53f"}
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.107661 4763 scope.go:117] "RemoveContainer" containerID="121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.110696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b80f7807-fbc1-49a4-9792-293545704695","Type":"ContainerStarted","Data":"ff8e7e5f37ab15d4d1f0a15f0dcbfc745e38aadbecce8514bad4ec5d292ab70f"}
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.127261 4763 scope.go:117] "RemoveContainer" containerID="625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.154871 4763 scope.go:117] "RemoveContainer" containerID="121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"
Oct 06 16:11:10 crc kubenswrapper[4763]: E1006 16:11:10.157172 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443\": container with ID starting with 121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443 not found: ID does not exist" containerID="121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.157216 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443"} err="failed to get container status \"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443\": rpc error: code = NotFound desc = could not find container \"121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443\": container with ID starting with 121183f5ff5eec36d8b95cd6d80fe4780a61899c90a0b36ac2000d90837e3443 not found: ID does not exist"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.157243 4763 scope.go:117] "RemoveContainer" containerID="625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"
Oct 06 16:11:10 crc kubenswrapper[4763]: E1006 16:11:10.157727 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b\": container with ID starting with 625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b not found: ID does not exist" containerID="625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.157871 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b"} err="failed to get container status \"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b\": rpc error: code = NotFound desc = could not find container \"625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b\": container with ID starting with 625a539886226727232598bb185bb18b7b84a4435a75d2bab5e3c9f74f248e1b not found: ID does not exist"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.159425 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.164200 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.179841 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: E1006 16:11:10.180160 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="setup-container"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.180171 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="setup-container"
Oct 06 16:11:10 crc kubenswrapper[4763]: E1006 16:11:10.180186 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="rabbitmq"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.180193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="rabbitmq"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.180369 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" containerName="rabbitmq"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.181159 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.183356 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.184154 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.184613 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.184831 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g8djm"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.184972 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.191660 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281004 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fvk\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-kube-api-access-46fvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281179 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281260 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.281680 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.382723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.382794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.382862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.382909 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.382957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.383022 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fvk\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-kube-api-access-46fvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.383078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.383110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.383152 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.384168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.384543 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.385986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.386227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.386765 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.386817 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f1abfb4440df30f4c33d4ec8b540cf45ddbcc43283db36e58f92db789df22e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.387164 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.388258 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.390157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.403806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fvk\" (UniqueName: \"kubernetes.io/projected/5b5a728e-4fec-45c7-866e-a8dd895c6a2b-kube-api-access-46fvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.429367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fea0408c-b5f9-4d6d-b671-9be9c8da8903\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b5a728e-4fec-45c7-866e-a8dd895c6a2b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.459832 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.502982 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.510173 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.510511 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="dnsmasq-dns" containerID="cri-o://0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1" gracePeriod=10
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.943753 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7"
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.983590 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.992698 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljsbb\" (UniqueName: \"kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb\") pod \"b1f698bb-55ff-4199-afcd-b7d16ac14754\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") "
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.992919 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config\") pod \"b1f698bb-55ff-4199-afcd-b7d16ac14754\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") "
Oct 06 16:11:10 crc kubenswrapper[4763]: I1006 16:11:10.993055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc\") pod \"b1f698bb-55ff-4199-afcd-b7d16ac14754\" (UID: \"b1f698bb-55ff-4199-afcd-b7d16ac14754\") "
Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.065922 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb" (OuterVolumeSpecName: "kube-api-access-ljsbb") pod "b1f698bb-55ff-4199-afcd-b7d16ac14754" (UID: "b1f698bb-55ff-4199-afcd-b7d16ac14754"). InnerVolumeSpecName "kube-api-access-ljsbb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.095060 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljsbb\" (UniqueName: \"kubernetes.io/projected/b1f698bb-55ff-4199-afcd-b7d16ac14754-kube-api-access-ljsbb\") on node \"crc\" DevicePath \"\""
Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.107703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config" (OuterVolumeSpecName: "config") pod "b1f698bb-55ff-4199-afcd-b7d16ac14754" (UID: "b1f698bb-55ff-4199-afcd-b7d16ac14754"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.107963 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1f698bb-55ff-4199-afcd-b7d16ac14754" (UID: "b1f698bb-55ff-4199-afcd-b7d16ac14754"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.118921 4763 generic.go:334] "Generic (PLEG): container finished" podID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerID="0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1" exitCode=0 Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.119028 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" event={"ID":"b1f698bb-55ff-4199-afcd-b7d16ac14754","Type":"ContainerDied","Data":"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1"} Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.119099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" event={"ID":"b1f698bb-55ff-4199-afcd-b7d16ac14754","Type":"ContainerDied","Data":"6f17e680e09a6203fb10910a293ea0805ac370109c6da283cc0ee8c8553595f8"} Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.119051 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-pt8b7" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.119126 4763 scope.go:117] "RemoveContainer" containerID="0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.120325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b5a728e-4fec-45c7-866e-a8dd895c6a2b","Type":"ContainerStarted","Data":"f38cb91be91c8e4857baac2dc3322db4e033731a215c95d8dc540447252b770d"} Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.167106 4763 scope.go:117] "RemoveContainer" containerID="120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.191033 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"] Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.193903 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-pt8b7"] Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.196428 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.196466 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f698bb-55ff-4199-afcd-b7d16ac14754-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.202609 4763 scope.go:117] "RemoveContainer" containerID="0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1" Oct 06 16:11:11 crc kubenswrapper[4763]: E1006 16:11:11.203038 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1\": container with ID starting with 0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1 not found: ID does not exist" containerID="0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.203085 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1"} err="failed to get container status 
\"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1\": rpc error: code = NotFound desc = could not find container \"0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1\": container with ID starting with 0de4f6f045edaea5e64f8eb8feac3fcf4c02548e7522304f1d75e5b9a85921c1 not found: ID does not exist" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.203122 4763 scope.go:117] "RemoveContainer" containerID="120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5" Oct 06 16:11:11 crc kubenswrapper[4763]: E1006 16:11:11.203466 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5\": container with ID starting with 120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5 not found: ID does not exist" containerID="120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.203493 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5"} err="failed to get container status \"120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5\": rpc error: code = NotFound desc = could not find container \"120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5\": container with ID starting with 120db9cc572d5f3816813cbf7402e57bfa74b51fcc525847ed3822191b047ee5 not found: ID does not exist" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.583741 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" path="/var/lib/kubelet/pods/b1f698bb-55ff-4199-afcd-b7d16ac14754/volumes" Oct 06 16:11:11 crc kubenswrapper[4763]: I1006 16:11:11.584825 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6" path="/var/lib/kubelet/pods/c0a5a7fc-7e4b-4b0e-a83e-ba5e77659fb6/volumes" Oct 06 16:11:12 crc kubenswrapper[4763]: I1006 16:11:12.137514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b80f7807-fbc1-49a4-9792-293545704695","Type":"ContainerStarted","Data":"1044a1409b8375ca03f64557549ab63bba43ffea2eda81e0d26916291f47f716"} Oct 06 16:11:13 crc kubenswrapper[4763]: I1006 16:11:13.150937 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b5a728e-4fec-45c7-866e-a8dd895c6a2b","Type":"ContainerStarted","Data":"665a9a37a8e82f18cf7416f079bbc3dc7360be10a046b54799da243d720a7fb3"} Oct 06 16:11:16 crc kubenswrapper[4763]: I1006 16:11:16.575666 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:11:16 crc kubenswrapper[4763]: E1006 16:11:16.576479 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:11:28 crc kubenswrapper[4763]: I1006 16:11:28.575847 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:11:28 crc 
kubenswrapper[4763]: E1006 16:11:28.576851 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:11:41 crc kubenswrapper[4763]: I1006 16:11:41.576150 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:11:41 crc kubenswrapper[4763]: E1006 16:11:41.577284 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:11:44 crc kubenswrapper[4763]: I1006 16:11:44.447116 4763 generic.go:334] "Generic (PLEG): container finished" podID="b80f7807-fbc1-49a4-9792-293545704695" containerID="1044a1409b8375ca03f64557549ab63bba43ffea2eda81e0d26916291f47f716" exitCode=0 Oct 06 16:11:44 crc kubenswrapper[4763]: I1006 16:11:44.447216 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b80f7807-fbc1-49a4-9792-293545704695","Type":"ContainerDied","Data":"1044a1409b8375ca03f64557549ab63bba43ffea2eda81e0d26916291f47f716"} Oct 06 16:11:45 crc kubenswrapper[4763]: I1006 16:11:45.461511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b80f7807-fbc1-49a4-9792-293545704695","Type":"ContainerStarted","Data":"ce1b2c8579b763bc05348e3aec2508616612a79f929e5bb22383c0a79ac8eee4"} Oct 06 16:11:45 crc kubenswrapper[4763]: I1006 16:11:45.462709 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 16:11:45 crc kubenswrapper[4763]: I1006 16:11:45.487812 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.487791619 podStartE2EDuration="36.487791619s" podCreationTimestamp="2025-10-06 16:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:11:45.48525825 +0000 UTC m=+4702.640550782" watchObservedRunningTime="2025-10-06 16:11:45.487791619 +0000 UTC m=+4702.643084141" Oct 06 16:11:46 crc kubenswrapper[4763]: I1006 16:11:46.470817 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b5a728e-4fec-45c7-866e-a8dd895c6a2b" containerID="665a9a37a8e82f18cf7416f079bbc3dc7360be10a046b54799da243d720a7fb3" exitCode=0 Oct 06 16:11:46 crc kubenswrapper[4763]: I1006 16:11:46.470889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b5a728e-4fec-45c7-866e-a8dd895c6a2b","Type":"ContainerDied","Data":"665a9a37a8e82f18cf7416f079bbc3dc7360be10a046b54799da243d720a7fb3"} Oct 06 16:11:47 crc kubenswrapper[4763]: I1006 16:11:47.487327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"5b5a728e-4fec-45c7-866e-a8dd895c6a2b","Type":"ContainerStarted","Data":"b35c034d7870a5089fbb0888da79c0aa5d219e572c94de2c4296110896628b94"} Oct 06 16:11:47 crc kubenswrapper[4763]: I1006 16:11:47.487950 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:11:47 crc kubenswrapper[4763]: I1006 16:11:47.520580 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.520551942 podStartE2EDuration="37.520551942s" podCreationTimestamp="2025-10-06 16:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:11:47.512004249 +0000 UTC m=+4704.667296801" watchObservedRunningTime="2025-10-06 16:11:47.520551942 +0000 UTC m=+4704.675844504" Oct 06 16:11:52 crc kubenswrapper[4763]: I1006 16:11:52.575185 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:11:52 crc kubenswrapper[4763]: E1006 16:11:52.575730 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:11:59 crc kubenswrapper[4763]: I1006 16:11:59.783034 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 16:12:00 crc kubenswrapper[4763]: I1006 16:12:00.505998 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 16:12:06 crc kubenswrapper[4763]: I1006 16:12:06.575744 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:12:06 crc kubenswrapper[4763]: E1006 16:12:06.576913 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.625401 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 06 16:12:07 crc kubenswrapper[4763]: E1006 16:12:07.625833 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="dnsmasq-dns" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.625856 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="dnsmasq-dns" Oct 06 16:12:07 crc kubenswrapper[4763]: E1006 16:12:07.625904 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="init" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.625916 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="init" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 
16:12:07.626197 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f698bb-55ff-4199-afcd-b7d16ac14754" containerName="dnsmasq-dns" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.627072 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.627194 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.631715 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r7b94" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.757954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbcw6\" (UniqueName: \"kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6\") pod \"mariadb-client-1-default\" (UID: \"e09c24c2-0fbf-473f-a151-d0dfa88b87b4\") " pod="openstack/mariadb-client-1-default" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.860472 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcw6\" (UniqueName: \"kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6\") pod \"mariadb-client-1-default\" (UID: \"e09c24c2-0fbf-473f-a151-d0dfa88b87b4\") " pod="openstack/mariadb-client-1-default" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.887021 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcw6\" (UniqueName: \"kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6\") pod \"mariadb-client-1-default\" (UID: \"e09c24c2-0fbf-473f-a151-d0dfa88b87b4\") " pod="openstack/mariadb-client-1-default" Oct 06 16:12:07 crc kubenswrapper[4763]: I1006 16:12:07.963837 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 06 16:12:08 crc kubenswrapper[4763]: I1006 16:12:08.478393 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 06 16:12:08 crc kubenswrapper[4763]: W1006 16:12:08.487630 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09c24c2_0fbf_473f_a151_d0dfa88b87b4.slice/crio-a396b86e3eeb0c0d0327ccf96f5c7deacf02db157e23a5ff391825d2270e0d43 WatchSource:0}: Error finding container a396b86e3eeb0c0d0327ccf96f5c7deacf02db157e23a5ff391825d2270e0d43: Status 404 returned error can't find the container with id a396b86e3eeb0c0d0327ccf96f5c7deacf02db157e23a5ff391825d2270e0d43 Oct 06 16:12:08 crc kubenswrapper[4763]: I1006 16:12:08.672499 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"e09c24c2-0fbf-473f-a151-d0dfa88b87b4","Type":"ContainerStarted","Data":"1e0f7ae927c323cdf19bf59a53cfb5b3823819c808001f5bf370b78a34b814bd"} Oct 06 16:12:08 crc kubenswrapper[4763]: I1006 16:12:08.672562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"e09c24c2-0fbf-473f-a151-d0dfa88b87b4","Type":"ContainerStarted","Data":"a396b86e3eeb0c0d0327ccf96f5c7deacf02db157e23a5ff391825d2270e0d43"} Oct 06 16:12:08 crc kubenswrapper[4763]: I1006 16:12:08.687663 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-1-default" podStartSLOduration=1.687647375 podStartE2EDuration="1.687647375s" podCreationTimestamp="2025-10-06 16:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:12:08.685427034 +0000 UTC m=+4725.840719536" watchObservedRunningTime="2025-10-06 16:12:08.687647375 +0000 UTC m=+4725.842939887" Oct 06 16:12:08 crc kubenswrapper[4763]: I1006 16:12:08.744536 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_e09c24c2-0fbf-473f-a151-d0dfa88b87b4/mariadb-client-1-default/0.log" Oct 06 16:12:09 crc kubenswrapper[4763]: I1006 16:12:09.681813 4763 generic.go:334] "Generic (PLEG): container finished" podID="e09c24c2-0fbf-473f-a151-d0dfa88b87b4" containerID="1e0f7ae927c323cdf19bf59a53cfb5b3823819c808001f5bf370b78a34b814bd" exitCode=0 Oct 06 16:12:09 crc kubenswrapper[4763]: I1006 16:12:09.681857 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"e09c24c2-0fbf-473f-a151-d0dfa88b87b4","Type":"ContainerDied","Data":"1e0f7ae927c323cdf19bf59a53cfb5b3823819c808001f5bf370b78a34b814bd"} Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.134397 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.177028 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.184238 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.216419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbcw6\" (UniqueName: \"kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6\") pod \"e09c24c2-0fbf-473f-a151-d0dfa88b87b4\" (UID: \"e09c24c2-0fbf-473f-a151-d0dfa88b87b4\") " Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.221826 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6" (OuterVolumeSpecName: "kube-api-access-gbcw6") pod "e09c24c2-0fbf-473f-a151-d0dfa88b87b4" (UID: "e09c24c2-0fbf-473f-a151-d0dfa88b87b4"). InnerVolumeSpecName "kube-api-access-gbcw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.320404 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbcw6\" (UniqueName: \"kubernetes.io/projected/e09c24c2-0fbf-473f-a151-d0dfa88b87b4-kube-api-access-gbcw6\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.591868 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09c24c2-0fbf-473f-a151-d0dfa88b87b4" path="/var/lib/kubelet/pods/e09c24c2-0fbf-473f-a151-d0dfa88b87b4/volumes" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.710558 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 06 16:12:11 crc kubenswrapper[4763]: E1006 16:12:11.711870 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09c24c2-0fbf-473f-a151-d0dfa88b87b4" containerName="mariadb-client-1-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.711914 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09c24c2-0fbf-473f-a151-d0dfa88b87b4" containerName="mariadb-client-1-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.712702 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09c24c2-0fbf-473f-a151-d0dfa88b87b4" containerName="mariadb-client-1-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.714599 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.728752 4763 scope.go:117] "RemoveContainer" containerID="1e0f7ae927c323cdf19bf59a53cfb5b3823819c808001f5bf370b78a34b814bd" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.728954 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.733979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.829256 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w8r8\" (UniqueName: \"kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8\") pod \"mariadb-client-2-default\" (UID: \"0e8974eb-9fb0-405e-88d6-9afaef2bc955\") " pod="openstack/mariadb-client-2-default" Oct 06 16:12:11 crc kubenswrapper[4763]: I1006 16:12:11.930770 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w8r8\" (UniqueName: \"kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8\") pod \"mariadb-client-2-default\" (UID: \"0e8974eb-9fb0-405e-88d6-9afaef2bc955\") " pod="openstack/mariadb-client-2-default" Oct 06 16:12:12 crc kubenswrapper[4763]: I1006 16:12:12.368152 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w8r8\" (UniqueName: \"kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8\") pod \"mariadb-client-2-default\" (UID: \"0e8974eb-9fb0-405e-88d6-9afaef2bc955\") " pod="openstack/mariadb-client-2-default" Oct 06 16:12:12 crc kubenswrapper[4763]: I1006 16:12:12.392243 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 06 16:12:13 crc kubenswrapper[4763]: I1006 16:12:12.945462 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 06 16:12:13 crc kubenswrapper[4763]: W1006 16:12:12.955778 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8974eb_9fb0_405e_88d6_9afaef2bc955.slice/crio-9c2954b4e75e40e2f654251c17209214f3b374774c1f57eb0771c4acc377c0db WatchSource:0}: Error finding container 9c2954b4e75e40e2f654251c17209214f3b374774c1f57eb0771c4acc377c0db: Status 404 returned error can't find the container with id 9c2954b4e75e40e2f654251c17209214f3b374774c1f57eb0771c4acc377c0db Oct 06 16:12:13 crc kubenswrapper[4763]: I1006 16:12:13.745196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"0e8974eb-9fb0-405e-88d6-9afaef2bc955","Type":"ContainerStarted","Data":"e891ede40f6595ded8bfd1aec0bf740feb8834f22c0681135b7c2a8120eecbfc"} Oct 06 16:12:13 crc kubenswrapper[4763]: I1006 16:12:13.745595 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"0e8974eb-9fb0-405e-88d6-9afaef2bc955","Type":"ContainerStarted","Data":"9c2954b4e75e40e2f654251c17209214f3b374774c1f57eb0771c4acc377c0db"} Oct 06 16:12:13 crc kubenswrapper[4763]: I1006 16:12:13.762779 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=2.762758389 podStartE2EDuration="2.762758389s" podCreationTimestamp="2025-10-06 16:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:12:13.757226139 +0000 UTC m=+4730.912518651" watchObservedRunningTime="2025-10-06 16:12:13.762758389 +0000 UTC m=+4730.918050901" Oct 06 16:12:14 crc kubenswrapper[4763]: I1006 16:12:14.755720 
4763 generic.go:334] "Generic (PLEG): container finished" podID="0e8974eb-9fb0-405e-88d6-9afaef2bc955" containerID="e891ede40f6595ded8bfd1aec0bf740feb8834f22c0681135b7c2a8120eecbfc" exitCode=0 Oct 06 16:12:14 crc kubenswrapper[4763]: I1006 16:12:14.755787 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"0e8974eb-9fb0-405e-88d6-9afaef2bc955","Type":"ContainerDied","Data":"e891ede40f6595ded8bfd1aec0bf740feb8834f22c0681135b7c2a8120eecbfc"} Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.148743 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.185219 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.194227 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.200295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w8r8\" (UniqueName: \"kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8\") pod \"0e8974eb-9fb0-405e-88d6-9afaef2bc955\" (UID: \"0e8974eb-9fb0-405e-88d6-9afaef2bc955\") " Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.205070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8" (OuterVolumeSpecName: "kube-api-access-8w8r8") pod "0e8974eb-9fb0-405e-88d6-9afaef2bc955" (UID: "0e8974eb-9fb0-405e-88d6-9afaef2bc955"). InnerVolumeSpecName "kube-api-access-8w8r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.302249 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w8r8\" (UniqueName: \"kubernetes.io/projected/0e8974eb-9fb0-405e-88d6-9afaef2bc955-kube-api-access-8w8r8\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.632736 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 06 16:12:16 crc kubenswrapper[4763]: E1006 16:12:16.633296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8974eb-9fb0-405e-88d6-9afaef2bc955" containerName="mariadb-client-2-default" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.633378 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8974eb-9fb0-405e-88d6-9afaef2bc955" containerName="mariadb-client-2-default" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.633704 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8974eb-9fb0-405e-88d6-9afaef2bc955" containerName="mariadb-client-2-default" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.634328 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.634487 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.707052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvc2p\" (UniqueName: \"kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p\") pod \"mariadb-client-1\" (UID: \"7977a95d-1b8e-47f1-a3a7-7a245f6940bf\") " pod="openstack/mariadb-client-1" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.771602 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2954b4e75e40e2f654251c17209214f3b374774c1f57eb0771c4acc377c0db" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.771719 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.808242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvc2p\" (UniqueName: \"kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p\") pod \"mariadb-client-1\" (UID: \"7977a95d-1b8e-47f1-a3a7-7a245f6940bf\") " pod="openstack/mariadb-client-1" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.827504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvc2p\" (UniqueName: \"kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p\") pod \"mariadb-client-1\" (UID: \"7977a95d-1b8e-47f1-a3a7-7a245f6940bf\") " pod="openstack/mariadb-client-1" Oct 06 16:12:16 crc kubenswrapper[4763]: I1006 16:12:16.986367 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 06 16:12:17 crc kubenswrapper[4763]: I1006 16:12:17.483311 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 06 16:12:17 crc kubenswrapper[4763]: I1006 16:12:17.588069 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8974eb-9fb0-405e-88d6-9afaef2bc955" path="/var/lib/kubelet/pods/0e8974eb-9fb0-405e-88d6-9afaef2bc955/volumes" Oct 06 16:12:17 crc kubenswrapper[4763]: I1006 16:12:17.783420 4763 generic.go:334] "Generic (PLEG): container finished" podID="7977a95d-1b8e-47f1-a3a7-7a245f6940bf" containerID="7dd26f4bd02a1d23825dc5032da2d365455e2fa0bc80db284a5c6edb8cabc5bd" exitCode=0 Oct 06 16:12:17 crc kubenswrapper[4763]: I1006 16:12:17.783503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"7977a95d-1b8e-47f1-a3a7-7a245f6940bf","Type":"ContainerDied","Data":"7dd26f4bd02a1d23825dc5032da2d365455e2fa0bc80db284a5c6edb8cabc5bd"} Oct 06 16:12:17 crc kubenswrapper[4763]: I1006 16:12:17.783668 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"7977a95d-1b8e-47f1-a3a7-7a245f6940bf","Type":"ContainerStarted","Data":"7bebbe5a462dc410da7c24015d805d73060afbf621f06321dc33cebf8b54955f"} Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.275459 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.300991 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_7977a95d-1b8e-47f1-a3a7-7a245f6940bf/mariadb-client-1/0.log" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.330422 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.342268 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.350278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvc2p\" (UniqueName: \"kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p\") pod \"7977a95d-1b8e-47f1-a3a7-7a245f6940bf\" (UID: \"7977a95d-1b8e-47f1-a3a7-7a245f6940bf\") " Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.355897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p" (OuterVolumeSpecName: "kube-api-access-pvc2p") pod "7977a95d-1b8e-47f1-a3a7-7a245f6940bf" (UID: "7977a95d-1b8e-47f1-a3a7-7a245f6940bf"). InnerVolumeSpecName "kube-api-access-pvc2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.452412 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvc2p\" (UniqueName: \"kubernetes.io/projected/7977a95d-1b8e-47f1-a3a7-7a245f6940bf-kube-api-access-pvc2p\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.592232 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7977a95d-1b8e-47f1-a3a7-7a245f6940bf" path="/var/lib/kubelet/pods/7977a95d-1b8e-47f1-a3a7-7a245f6940bf/volumes" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.735251 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 06 16:12:19 crc kubenswrapper[4763]: E1006 16:12:19.735600 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7977a95d-1b8e-47f1-a3a7-7a245f6940bf" containerName="mariadb-client-1" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.735616 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7977a95d-1b8e-47f1-a3a7-7a245f6940bf" containerName="mariadb-client-1" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.735761 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7977a95d-1b8e-47f1-a3a7-7a245f6940bf" containerName="mariadb-client-1" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.736253 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.744237 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.757125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcgv\" (UniqueName: \"kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv\") pod \"mariadb-client-4-default\" (UID: \"cb0867ed-e1dd-4144-a13a-b3174776d529\") " pod="openstack/mariadb-client-4-default" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.804595 4763 scope.go:117] "RemoveContainer" containerID="7dd26f4bd02a1d23825dc5032da2d365455e2fa0bc80db284a5c6edb8cabc5bd" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.804801 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.858648 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcgv\" (UniqueName: \"kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv\") pod \"mariadb-client-4-default\" (UID: \"cb0867ed-e1dd-4144-a13a-b3174776d529\") " pod="openstack/mariadb-client-4-default" Oct 06 16:12:19 crc kubenswrapper[4763]: I1006 16:12:19.878014 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcgv\" (UniqueName: \"kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv\") pod \"mariadb-client-4-default\" (UID: \"cb0867ed-e1dd-4144-a13a-b3174776d529\") " pod="openstack/mariadb-client-4-default" Oct 06 16:12:20 crc kubenswrapper[4763]: I1006 16:12:20.060157 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 06 16:12:20 crc kubenswrapper[4763]: I1006 16:12:20.574864 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:12:20 crc kubenswrapper[4763]: E1006 16:12:20.575570 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:12:20 crc kubenswrapper[4763]: I1006 16:12:20.671991 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 06 16:12:20 crc kubenswrapper[4763]: W1006 16:12:20.675781 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb0867ed_e1dd_4144_a13a_b3174776d529.slice/crio-8cbd72799522399528c2c2fed933588ef58db1c4ecce150ef62ff50b00075931 WatchSource:0}: Error finding container 8cbd72799522399528c2c2fed933588ef58db1c4ecce150ef62ff50b00075931: Status 404 returned error can't find the container with id 8cbd72799522399528c2c2fed933588ef58db1c4ecce150ef62ff50b00075931 Oct 06 16:12:20 crc kubenswrapper[4763]: I1006 16:12:20.816207 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"cb0867ed-e1dd-4144-a13a-b3174776d529","Type":"ContainerStarted","Data":"8cbd72799522399528c2c2fed933588ef58db1c4ecce150ef62ff50b00075931"} Oct 06 16:12:21 crc kubenswrapper[4763]: I1006 16:12:21.827545 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb0867ed-e1dd-4144-a13a-b3174776d529" containerID="fe5137cd9675cfa4bc710b3520157a568eede55f15f5d6459e234b1c9f94a416" exitCode=0 Oct 06 16:12:21 crc kubenswrapper[4763]: I1006 16:12:21.827701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"cb0867ed-e1dd-4144-a13a-b3174776d529","Type":"ContainerDied","Data":"fe5137cd9675cfa4bc710b3520157a568eede55f15f5d6459e234b1c9f94a416"} Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.223130 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.245699 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_cb0867ed-e1dd-4144-a13a-b3174776d529/mariadb-client-4-default/0.log" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.273917 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.280473 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.312992 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcgv\" (UniqueName: \"kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv\") pod \"cb0867ed-e1dd-4144-a13a-b3174776d529\" (UID: \"cb0867ed-e1dd-4144-a13a-b3174776d529\") " Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.318554 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv" (OuterVolumeSpecName: "kube-api-access-sfcgv") pod "cb0867ed-e1dd-4144-a13a-b3174776d529" (UID: "cb0867ed-e1dd-4144-a13a-b3174776d529"). InnerVolumeSpecName "kube-api-access-sfcgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.414837 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcgv\" (UniqueName: \"kubernetes.io/projected/cb0867ed-e1dd-4144-a13a-b3174776d529-kube-api-access-sfcgv\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.593443 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0867ed-e1dd-4144-a13a-b3174776d529" path="/var/lib/kubelet/pods/cb0867ed-e1dd-4144-a13a-b3174776d529/volumes" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.846284 4763 scope.go:117] "RemoveContainer" containerID="fe5137cd9675cfa4bc710b3520157a568eede55f15f5d6459e234b1c9f94a416" Oct 06 16:12:23 crc kubenswrapper[4763]: I1006 16:12:23.846364 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.357464 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 06 16:12:27 crc kubenswrapper[4763]: E1006 16:12:27.358243 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0867ed-e1dd-4144-a13a-b3174776d529" containerName="mariadb-client-4-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.358254 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0867ed-e1dd-4144-a13a-b3174776d529" containerName="mariadb-client-4-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.358421 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0867ed-e1dd-4144-a13a-b3174776d529" containerName="mariadb-client-4-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.358884 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.361025 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r7b94" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.378794 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt258\" (UniqueName: \"kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258\") pod \"mariadb-client-5-default\" (UID: \"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77\") " pod="openstack/mariadb-client-5-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.411959 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.480176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt258\" (UniqueName: \"kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258\") pod \"mariadb-client-5-default\" (UID: \"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77\") " pod="openstack/mariadb-client-5-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.503947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt258\" (UniqueName: \"kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258\") pod \"mariadb-client-5-default\" (UID: \"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77\") " pod="openstack/mariadb-client-5-default" Oct 06 16:12:27 crc kubenswrapper[4763]: I1006 16:12:27.717922 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 06 16:12:28 crc kubenswrapper[4763]: I1006 16:12:28.029373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 06 16:12:28 crc kubenswrapper[4763]: W1006 16:12:28.038351 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbb8d788_fb78_4ed5_8be2_7cfcfee11e77.slice/crio-15066732897c6ce33123162130ac60328ac56057032c50d8facf13c0374cbffc WatchSource:0}: Error finding container 15066732897c6ce33123162130ac60328ac56057032c50d8facf13c0374cbffc: Status 404 returned error can't find the container with id 15066732897c6ce33123162130ac60328ac56057032c50d8facf13c0374cbffc Oct 06 16:12:28 crc kubenswrapper[4763]: I1006 16:12:28.903548 4763 generic.go:334] "Generic (PLEG): container finished" podID="cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" containerID="9da989787b4d480bbae672c943b007c6a7488de37cac898f17a23a1e8beacb87" exitCode=0 Oct 06 16:12:28 crc kubenswrapper[4763]: I1006 16:12:28.903652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77","Type":"ContainerDied","Data":"9da989787b4d480bbae672c943b007c6a7488de37cac898f17a23a1e8beacb87"} Oct 06 16:12:28 crc kubenswrapper[4763]: I1006 16:12:28.903843 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77","Type":"ContainerStarted","Data":"15066732897c6ce33123162130ac60328ac56057032c50d8facf13c0374cbffc"} Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.326854 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.351649 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_cbb8d788-fb78-4ed5-8be2-7cfcfee11e77/mariadb-client-5-default/0.log" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.380527 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.387442 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.427094 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt258\" (UniqueName: \"kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258\") pod \"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77\" (UID: \"cbb8d788-fb78-4ed5-8be2-7cfcfee11e77\") " Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.434214 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258" (OuterVolumeSpecName: "kube-api-access-dt258") pod "cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" (UID: "cbb8d788-fb78-4ed5-8be2-7cfcfee11e77"). InnerVolumeSpecName "kube-api-access-dt258". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.498574 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 06 16:12:30 crc kubenswrapper[4763]: E1006 16:12:30.499170 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" containerName="mariadb-client-5-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.499203 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" containerName="mariadb-client-5-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.499508 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" containerName="mariadb-client-5-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.500428 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.506588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.528451 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxkd\" (UniqueName: \"kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd\") pod \"mariadb-client-6-default\" (UID: \"13cced74-66b5-48a9-9974-c04c325e13ca\") " pod="openstack/mariadb-client-6-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.528647 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt258\" (UniqueName: \"kubernetes.io/projected/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77-kube-api-access-dt258\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.630191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxkd\" (UniqueName: \"kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd\") pod \"mariadb-client-6-default\" (UID: \"13cced74-66b5-48a9-9974-c04c325e13ca\") " pod="openstack/mariadb-client-6-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.649423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxkd\" (UniqueName: \"kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd\") pod \"mariadb-client-6-default\" (UID: \"13cced74-66b5-48a9-9974-c04c325e13ca\") " pod="openstack/mariadb-client-6-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.822915 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.929736 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15066732897c6ce33123162130ac60328ac56057032c50d8facf13c0374cbffc" Oct 06 16:12:30 crc kubenswrapper[4763]: I1006 16:12:30.929987 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 06 16:12:31 crc kubenswrapper[4763]: I1006 16:12:31.433989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 06 16:12:31 crc kubenswrapper[4763]: I1006 16:12:31.591369 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb8d788-fb78-4ed5-8be2-7cfcfee11e77" path="/var/lib/kubelet/pods/cbb8d788-fb78-4ed5-8be2-7cfcfee11e77/volumes" Oct 06 16:12:31 crc kubenswrapper[4763]: I1006 16:12:31.942417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"13cced74-66b5-48a9-9974-c04c325e13ca","Type":"ContainerStarted","Data":"ccd862a1b0624e0f932212c354f37fb049363a038d98682928750992554b2c70"} Oct 06 16:12:31 crc kubenswrapper[4763]: I1006 16:12:31.942724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"13cced74-66b5-48a9-9974-c04c325e13ca","Type":"ContainerStarted","Data":"ecf90e5398aab581abe06530c6738813d31072f4a63081e748abac0bd2fe19fb"} Oct 06 16:12:31 crc kubenswrapper[4763]: I1006 16:12:31.968380 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.968311851 podStartE2EDuration="1.968311851s" podCreationTimestamp="2025-10-06 16:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:12:31.959141911 +0000 UTC m=+4749.114434463" watchObservedRunningTime="2025-10-06 16:12:31.968311851 +0000 UTC m=+4749.123604403" Oct 06 16:12:32 crc kubenswrapper[4763]: I1006 16:12:32.575223 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:12:32 crc kubenswrapper[4763]: E1006 16:12:32.575885 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:12:32 crc kubenswrapper[4763]: I1006 16:12:32.951309 4763 generic.go:334] "Generic (PLEG): container finished" podID="13cced74-66b5-48a9-9974-c04c325e13ca" containerID="ccd862a1b0624e0f932212c354f37fb049363a038d98682928750992554b2c70" exitCode=0 Oct 06 16:12:32 crc kubenswrapper[4763]: I1006 16:12:32.951351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"13cced74-66b5-48a9-9974-c04c325e13ca","Type":"ContainerDied","Data":"ccd862a1b0624e0f932212c354f37fb049363a038d98682928750992554b2c70"} Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.467234 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.486672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxkd\" (UniqueName: \"kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd\") pod \"13cced74-66b5-48a9-9974-c04c325e13ca\" (UID: \"13cced74-66b5-48a9-9974-c04c325e13ca\") " Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.494654 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd" (OuterVolumeSpecName: "kube-api-access-5cxkd") pod "13cced74-66b5-48a9-9974-c04c325e13ca" (UID: "13cced74-66b5-48a9-9974-c04c325e13ca"). InnerVolumeSpecName "kube-api-access-5cxkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.506042 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.515498 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.588220 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxkd\" (UniqueName: \"kubernetes.io/projected/13cced74-66b5-48a9-9974-c04c325e13ca-kube-api-access-5cxkd\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.626801 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 06 16:12:34 crc kubenswrapper[4763]: E1006 16:12:34.627111 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cced74-66b5-48a9-9974-c04c325e13ca" containerName="mariadb-client-6-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.627135 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cced74-66b5-48a9-9974-c04c325e13ca" containerName="mariadb-client-6-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.627294 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cced74-66b5-48a9-9974-c04c325e13ca" containerName="mariadb-client-6-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.627795 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.634068 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.690008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ggs\" (UniqueName: \"kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs\") pod \"mariadb-client-7-default\" (UID: \"e1605e67-ce18-41cc-9fa3-766e5fb54271\") " pod="openstack/mariadb-client-7-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.792050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ggs\" (UniqueName: \"kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs\") pod \"mariadb-client-7-default\" (UID: \"e1605e67-ce18-41cc-9fa3-766e5fb54271\") " pod="openstack/mariadb-client-7-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.816512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ggs\" (UniqueName: \"kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs\") pod \"mariadb-client-7-default\" (UID: \"e1605e67-ce18-41cc-9fa3-766e5fb54271\") " pod="openstack/mariadb-client-7-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.946986 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.982741 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecf90e5398aab581abe06530c6738813d31072f4a63081e748abac0bd2fe19fb" Oct 06 16:12:34 crc kubenswrapper[4763]: I1006 16:12:34.982858 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 06 16:12:35 crc kubenswrapper[4763]: I1006 16:12:35.252016 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 06 16:12:35 crc kubenswrapper[4763]: I1006 16:12:35.586770 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cced74-66b5-48a9-9974-c04c325e13ca" path="/var/lib/kubelet/pods/13cced74-66b5-48a9-9974-c04c325e13ca/volumes" Oct 06 16:12:35 crc kubenswrapper[4763]: I1006 16:12:35.990514 4763 generic.go:334] "Generic (PLEG): container finished" podID="e1605e67-ce18-41cc-9fa3-766e5fb54271" containerID="a1db71a919c4cf33ef77e745a6583338916f9d1e5cab515fdb20189218069055" exitCode=0 Oct 06 16:12:35 crc kubenswrapper[4763]: I1006 16:12:35.990551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"e1605e67-ce18-41cc-9fa3-766e5fb54271","Type":"ContainerDied","Data":"a1db71a919c4cf33ef77e745a6583338916f9d1e5cab515fdb20189218069055"} Oct 06 16:12:35 crc kubenswrapper[4763]: I1006 16:12:35.990574 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"e1605e67-ce18-41cc-9fa3-766e5fb54271","Type":"ContainerStarted","Data":"10bdd72b72d4d7b4d865c67f730a1c0c1a86054ce46ae9ae0de6ae96e91c96f5"} Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.456885 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.504004 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_e1605e67-ce18-41cc-9fa3-766e5fb54271/mariadb-client-7-default/0.log" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.546814 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.546869 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.636121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2ggs\" (UniqueName: \"kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs\") pod \"e1605e67-ce18-41cc-9fa3-766e5fb54271\" (UID: \"e1605e67-ce18-41cc-9fa3-766e5fb54271\") " Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.660866 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 06 16:12:37 crc kubenswrapper[4763]: E1006 16:12:37.661267 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1605e67-ce18-41cc-9fa3-766e5fb54271" containerName="mariadb-client-7-default" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.661290 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1605e67-ce18-41cc-9fa3-766e5fb54271" containerName="mariadb-client-7-default" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.661468 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1605e67-ce18-41cc-9fa3-766e5fb54271" containerName="mariadb-client-7-default" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.662105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.675861 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.738273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6fg6\" (UniqueName: \"kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6\") pod \"mariadb-client-2\" (UID: \"9811934b-3e02-42c6-a214-e47c5ea3a0a3\") " pod="openstack/mariadb-client-2" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.840430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6fg6\" (UniqueName: \"kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6\") pod \"mariadb-client-2\" (UID: \"9811934b-3e02-42c6-a214-e47c5ea3a0a3\") " pod="openstack/mariadb-client-2" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.866607 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs" (OuterVolumeSpecName: "kube-api-access-d2ggs") pod "e1605e67-ce18-41cc-9fa3-766e5fb54271" (UID: "e1605e67-ce18-41cc-9fa3-766e5fb54271"). InnerVolumeSpecName "kube-api-access-d2ggs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.867831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6fg6\" (UniqueName: \"kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6\") pod \"mariadb-client-2\" (UID: \"9811934b-3e02-42c6-a214-e47c5ea3a0a3\") " pod="openstack/mariadb-client-2" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.941965 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2ggs\" (UniqueName: \"kubernetes.io/projected/e1605e67-ce18-41cc-9fa3-766e5fb54271-kube-api-access-d2ggs\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:37 crc kubenswrapper[4763]: I1006 16:12:37.987076 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 06 16:12:38 crc kubenswrapper[4763]: I1006 16:12:38.013236 4763 scope.go:117] "RemoveContainer" containerID="a1db71a919c4cf33ef77e745a6583338916f9d1e5cab515fdb20189218069055" Oct 06 16:12:38 crc kubenswrapper[4763]: I1006 16:12:38.013313 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 06 16:12:38 crc kubenswrapper[4763]: I1006 16:12:38.411859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 06 16:12:39 crc kubenswrapper[4763]: I1006 16:12:39.026855 4763 generic.go:334] "Generic (PLEG): container finished" podID="9811934b-3e02-42c6-a214-e47c5ea3a0a3" containerID="48349448639c3511b91d16da3e001d9735b77c7e4ec6cb1251776ab018e38955" exitCode=0 Oct 06 16:12:39 crc kubenswrapper[4763]: I1006 16:12:39.027349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"9811934b-3e02-42c6-a214-e47c5ea3a0a3","Type":"ContainerDied","Data":"48349448639c3511b91d16da3e001d9735b77c7e4ec6cb1251776ab018e38955"} Oct 06 16:12:39 crc kubenswrapper[4763]: I1006 16:12:39.027403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"9811934b-3e02-42c6-a214-e47c5ea3a0a3","Type":"ContainerStarted","Data":"e62c9a90b6efe2a9352221e577dfa74eb5e150700d7a2b56a886d40df232d0ec"} Oct 06 16:12:39 crc kubenswrapper[4763]: I1006 16:12:39.593065 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1605e67-ce18-41cc-9fa3-766e5fb54271" path="/var/lib/kubelet/pods/e1605e67-ce18-41cc-9fa3-766e5fb54271/volumes" Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.416609 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.438654 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_9811934b-3e02-42c6-a214-e47c5ea3a0a3/mariadb-client-2/0.log" Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.463734 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.470269 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.488940 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6fg6\" (UniqueName: \"kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6\") pod \"9811934b-3e02-42c6-a214-e47c5ea3a0a3\" (UID: \"9811934b-3e02-42c6-a214-e47c5ea3a0a3\") " Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.495439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6" (OuterVolumeSpecName: "kube-api-access-g6fg6") pod "9811934b-3e02-42c6-a214-e47c5ea3a0a3" (UID: "9811934b-3e02-42c6-a214-e47c5ea3a0a3"). InnerVolumeSpecName "kube-api-access-g6fg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:12:40 crc kubenswrapper[4763]: I1006 16:12:40.590976 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6fg6\" (UniqueName: \"kubernetes.io/projected/9811934b-3e02-42c6-a214-e47c5ea3a0a3-kube-api-access-g6fg6\") on node \"crc\" DevicePath \"\"" Oct 06 16:12:41 crc kubenswrapper[4763]: I1006 16:12:41.053342 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e62c9a90b6efe2a9352221e577dfa74eb5e150700d7a2b56a886d40df232d0ec" Oct 06 16:12:41 crc kubenswrapper[4763]: I1006 16:12:41.053418 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 06 16:12:41 crc kubenswrapper[4763]: I1006 16:12:41.594122 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9811934b-3e02-42c6-a214-e47c5ea3a0a3" path="/var/lib/kubelet/pods/9811934b-3e02-42c6-a214-e47c5ea3a0a3/volumes" Oct 06 16:12:43 crc kubenswrapper[4763]: I1006 16:12:43.579257 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:12:43 crc kubenswrapper[4763]: E1006 16:12:43.579775 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:12:57 crc kubenswrapper[4763]: I1006 16:12:57.574474 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:12:57 crc kubenswrapper[4763]: E1006 16:12:57.576576 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:13:09 crc kubenswrapper[4763]: I1006 16:13:09.580967 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:13:09 crc kubenswrapper[4763]: E1006 16:13:09.582077 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:13:21 crc kubenswrapper[4763]: I1006 16:13:21.575175 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:13:21 crc kubenswrapper[4763]: E1006 16:13:21.576004 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:13:29 crc kubenswrapper[4763]: I1006 16:13:29.004930 4763 scope.go:117] "RemoveContainer" containerID="866c5b5e21dcca95359e07dfea280f160fe712da6cecf10826d19fd6738a6a45" Oct 06 16:13:32 crc kubenswrapper[4763]: I1006 16:13:32.575602 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:13:32 crc kubenswrapper[4763]: E1006 16:13:32.576467 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:13:43 crc kubenswrapper[4763]: I1006 16:13:43.575870 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:13:43 crc kubenswrapper[4763]: E1006 16:13:43.576698 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:13:54 crc kubenswrapper[4763]: I1006 16:13:54.575609 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:13:54 crc kubenswrapper[4763]: E1006 16:13:54.576297 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:14:06 crc kubenswrapper[4763]: I1006 16:14:06.576267 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:14:06 crc kubenswrapper[4763]: E1006 16:14:06.577435 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:14:19 crc kubenswrapper[4763]: I1006 16:14:19.575076 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:14:19 crc kubenswrapper[4763]: E1006 16:14:19.577039 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:14:32 crc kubenswrapper[4763]: I1006 16:14:32.574884 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:14:32 crc kubenswrapper[4763]: E1006 16:14:32.575536 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:14:46 crc kubenswrapper[4763]: I1006 16:14:46.575414 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a" Oct 06 16:14:47 crc kubenswrapper[4763]: I1006 16:14:47.277229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42"} Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.163296 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj"] Oct 06 16:15:00 crc kubenswrapper[4763]: E1006 16:15:00.164815 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9811934b-3e02-42c6-a214-e47c5ea3a0a3" containerName="mariadb-client-2" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.164847 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9811934b-3e02-42c6-a214-e47c5ea3a0a3" containerName="mariadb-client-2" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.165223 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9811934b-3e02-42c6-a214-e47c5ea3a0a3" containerName="mariadb-client-2" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.166401 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.168659 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.169396 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.190127 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj"] Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.249096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.249482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2blhl\" (UniqueName: \"kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.249802 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.352236 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.352349 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2blhl\" (UniqueName: \"kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.352422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.353195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.366980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.466055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blhl\" (UniqueName: \"kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl\") pod \"collect-profiles-29329455-5jwhj\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.489961 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:00 crc kubenswrapper[4763]: I1006 16:15:00.789092 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj"] Oct 06 16:15:01 crc kubenswrapper[4763]: I1006 16:15:01.418822 4763 generic.go:334] "Generic (PLEG): container finished" podID="a955b364-4571-4269-80f2-66838a8d8303" containerID="250e16f53888ff5c48d86b6b9b48112889c2575631ca3778245e6472b34ce0ee" exitCode=0 Oct 06 16:15:01 crc kubenswrapper[4763]: I1006 16:15:01.418868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" event={"ID":"a955b364-4571-4269-80f2-66838a8d8303","Type":"ContainerDied","Data":"250e16f53888ff5c48d86b6b9b48112889c2575631ca3778245e6472b34ce0ee"} Oct 06 16:15:01 crc kubenswrapper[4763]: I1006 16:15:01.419169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" event={"ID":"a955b364-4571-4269-80f2-66838a8d8303","Type":"ContainerStarted","Data":"fc4547751a681fd34e86e90ca82d3d7b68b6f6c04221088fb44a1b25d7779d4f"} Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.722127 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.791113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2blhl\" (UniqueName: \"kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl\") pod \"a955b364-4571-4269-80f2-66838a8d8303\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.791298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume\") pod \"a955b364-4571-4269-80f2-66838a8d8303\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.791332 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume\") pod \"a955b364-4571-4269-80f2-66838a8d8303\" (UID: \"a955b364-4571-4269-80f2-66838a8d8303\") " Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.791951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume" (OuterVolumeSpecName: "config-volume") pod "a955b364-4571-4269-80f2-66838a8d8303" (UID: "a955b364-4571-4269-80f2-66838a8d8303"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.796775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl" (OuterVolumeSpecName: "kube-api-access-2blhl") pod "a955b364-4571-4269-80f2-66838a8d8303" (UID: "a955b364-4571-4269-80f2-66838a8d8303"). InnerVolumeSpecName "kube-api-access-2blhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.796932 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a955b364-4571-4269-80f2-66838a8d8303" (UID: "a955b364-4571-4269-80f2-66838a8d8303"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.893117 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a955b364-4571-4269-80f2-66838a8d8303-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.893197 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a955b364-4571-4269-80f2-66838a8d8303-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:15:02 crc kubenswrapper[4763]: I1006 16:15:02.893222 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2blhl\" (UniqueName: \"kubernetes.io/projected/a955b364-4571-4269-80f2-66838a8d8303-kube-api-access-2blhl\") on node \"crc\" DevicePath \"\"" Oct 06 16:15:03 crc kubenswrapper[4763]: I1006 16:15:03.441287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" event={"ID":"a955b364-4571-4269-80f2-66838a8d8303","Type":"ContainerDied","Data":"fc4547751a681fd34e86e90ca82d3d7b68b6f6c04221088fb44a1b25d7779d4f"} Oct 06 16:15:03 crc kubenswrapper[4763]: I1006 16:15:03.441359 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc4547751a681fd34e86e90ca82d3d7b68b6f6c04221088fb44a1b25d7779d4f" Oct 06 16:15:03 crc kubenswrapper[4763]: I1006 16:15:03.441419 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj" Oct 06 16:15:03 crc kubenswrapper[4763]: I1006 16:15:03.802718 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"] Oct 06 16:15:03 crc kubenswrapper[4763]: I1006 16:15:03.808849 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329410-v45wc"] Oct 06 16:15:05 crc kubenswrapper[4763]: I1006 16:15:05.591553 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c609b5b-2ad2-4145-a3e9-1fedbde830d8" path="/var/lib/kubelet/pods/9c609b5b-2ad2-4145-a3e9-1fedbde830d8/volumes" Oct 06 16:15:29 crc kubenswrapper[4763]: I1006 16:15:29.118128 4763 scope.go:117] "RemoveContainer" containerID="614d09c275e5c70b5c2676c37afc41016169eb9769d22d49017bcd07ed1fe108" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.627955 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"] Oct 06 16:16:02 crc kubenswrapper[4763]: E1006 16:16:02.628604 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a955b364-4571-4269-80f2-66838a8d8303" containerName="collect-profiles" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.628628 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a955b364-4571-4269-80f2-66838a8d8303" containerName="collect-profiles" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.628766 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a955b364-4571-4269-80f2-66838a8d8303" containerName="collect-profiles" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.629682 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.648519 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"] Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.666594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljrv\" (UniqueName: \"kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.666697 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.666743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.767790 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.767863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.767929 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljrv\" (UniqueName: \"kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.768437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.768455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.786161 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zljrv\" (UniqueName: \"kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv\") pod \"certified-operators-w8nvk\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:02 crc kubenswrapper[4763]: I1006 16:16:02.945071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.420566 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"] Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.829208 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.833208 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.844076 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.982316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.982478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:03 crc kubenswrapper[4763]: I1006 16:16:03.982883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rksx\" (UniqueName: \"kubernetes.io/projected/643f1f67-a16d-4a0e-abaa-9e832c2ae013-kube-api-access-6rksx\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.019884 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerID="1d277ee9e996ae4f1932ca33e43ab3276a84f439ae4af58fee13801744d50c98" exitCode=0 Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.019934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerDied","Data":"1d277ee9e996ae4f1932ca33e43ab3276a84f439ae4af58fee13801744d50c98"} Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.019963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerStarted","Data":"0d4642d8791a688059019ba614add96dd23e2383daa6af3ef1fa290cd748d76e"} Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.021689 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:16:04 crc 
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.084546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.084590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.085168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.085484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.106066 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rksx\" (UniqueName: \"kubernetes.io/projected/643f1f67-a16d-4a0e-abaa-9e832c2ae013-kube-api-access-6rksx\") pod \"community-operators-zr4px\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.179196 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr4px"
Need to start a new one" pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:04 crc kubenswrapper[4763]: I1006 16:16:04.673114 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:04 crc kubenswrapper[4763]: W1006 16:16:04.675702 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643f1f67_a16d_4a0e_abaa_9e832c2ae013.slice/crio-dec5ed5dc8643f71675672556bdc5af36811b3acf922ba310cd8a1d99c993597 WatchSource:0}: Error finding container dec5ed5dc8643f71675672556bdc5af36811b3acf922ba310cd8a1d99c993597: Status 404 returned error can't find the container with id dec5ed5dc8643f71675672556bdc5af36811b3acf922ba310cd8a1d99c993597 Oct 06 16:16:05 crc kubenswrapper[4763]: I1006 16:16:05.030679 4763 generic.go:334] "Generic (PLEG): container finished" podID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerID="44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c" exitCode=0 Oct 06 16:16:05 crc kubenswrapper[4763]: I1006 16:16:05.030835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr4px" event={"ID":"643f1f67-a16d-4a0e-abaa-9e832c2ae013","Type":"ContainerDied","Data":"44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c"} Oct 06 16:16:05 crc kubenswrapper[4763]: I1006 16:16:05.031081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr4px" event={"ID":"643f1f67-a16d-4a0e-abaa-9e832c2ae013","Type":"ContainerStarted","Data":"dec5ed5dc8643f71675672556bdc5af36811b3acf922ba310cd8a1d99c993597"} Oct 06 16:16:05 crc kubenswrapper[4763]: I1006 16:16:05.035571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerStarted","Data":"5493e7e290f5dbc2f88b6ed49c6e9eae3f0ba6fd2c156ce513af641a113e7ff2"} Oct 06 16:16:06 crc kubenswrapper[4763]: I1006 16:16:06.048741 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerID="5493e7e290f5dbc2f88b6ed49c6e9eae3f0ba6fd2c156ce513af641a113e7ff2" exitCode=0 Oct 06 16:16:06 crc kubenswrapper[4763]: I1006 16:16:06.048798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerDied","Data":"5493e7e290f5dbc2f88b6ed49c6e9eae3f0ba6fd2c156ce513af641a113e7ff2"} Oct 06 16:16:06 crc kubenswrapper[4763]: I1006 16:16:06.054986 4763 generic.go:334] "Generic (PLEG): container finished" podID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerID="e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7" exitCode=0 Oct 06 16:16:06 crc kubenswrapper[4763]: I1006 16:16:06.055047 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr4px" event={"ID":"643f1f67-a16d-4a0e-abaa-9e832c2ae013","Type":"ContainerDied","Data":"e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7"} Oct 06 16:16:07 crc kubenswrapper[4763]: I1006 16:16:07.065557 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerStarted","Data":"5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09"} Oct 06 16:16:07 crc kubenswrapper[4763]: I1006 
Oct 06 16:16:07 crc kubenswrapper[4763]: I1006 16:16:07.091781 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w8nvk" podStartSLOduration=2.561275597 podStartE2EDuration="5.091760806s" podCreationTimestamp="2025-10-06 16:16:02 +0000 UTC" firstStartedPulling="2025-10-06 16:16:04.021423233 +0000 UTC m=+4961.176715755" lastFinishedPulling="2025-10-06 16:16:06.551908412 +0000 UTC m=+4963.707200964" observedRunningTime="2025-10-06 16:16:07.085676883 +0000 UTC m=+4964.240969415" watchObservedRunningTime="2025-10-06 16:16:07.091760806 +0000 UTC m=+4964.247053338"
Oct 06 16:16:07 crc kubenswrapper[4763]: I1006 16:16:07.105329 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zr4px" podStartSLOduration=2.548086288 podStartE2EDuration="4.105311679s" podCreationTimestamp="2025-10-06 16:16:03 +0000 UTC" firstStartedPulling="2025-10-06 16:16:05.033038618 +0000 UTC m=+4962.188331130" lastFinishedPulling="2025-10-06 16:16:06.590263969 +0000 UTC m=+4963.745556521" observedRunningTime="2025-10-06 16:16:07.10087709 +0000 UTC m=+4964.256169592" watchObservedRunningTime="2025-10-06 16:16:07.105311679 +0000 UTC m=+4964.260604191"
Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.626992 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"]
Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.629184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7htsm"
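Unlike the mariadb-client pods earlier, these two catalog pods did pull images, and the two latency entries above decompose accordingly: podStartSLOduration is the end-to-end duration minus the image-pull time (lastFinishedPulling minus firstStartedPulling). A quick check against the certified-operators-w8nvk numbers, truncated to microseconds:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"

first_pull = datetime.strptime("2025-10-06 16:16:04.021423", FMT)  # firstStartedPulling
last_pull = datetime.strptime("2025-10-06 16:16:06.551908", FMT)   # lastFinishedPulling

pull = (last_pull - first_pull).total_seconds()  # ~2.530485 s spent pulling the image
e2e = 5.091760806                                # podStartE2EDuration from the entry above
print(e2e - pull)  # ~2.561276, matching podStartSLOduration=2.561275597
```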
Need to start a new one" pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.658773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"] Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.753043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.753149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vcrz\" (UniqueName: \"kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.753205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.854540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.854655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.854708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vcrz\" (UniqueName: \"kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.855099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.855186 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.876785 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7vcrz\" (UniqueName: \"kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz\") pod \"redhat-operators-7htsm\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:08 crc kubenswrapper[4763]: I1006 16:16:08.960575 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:09 crc kubenswrapper[4763]: I1006 16:16:09.440910 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"] Oct 06 16:16:09 crc kubenswrapper[4763]: W1006 16:16:09.464254 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf376097e_23a1_430d_927b_af057072ac6b.slice/crio-db27410dab7d212cb58e91c9f4eda23cbffe67b4245f72db9776f83ede4a189e WatchSource:0}: Error finding container db27410dab7d212cb58e91c9f4eda23cbffe67b4245f72db9776f83ede4a189e: Status 404 returned error can't find the container with id db27410dab7d212cb58e91c9f4eda23cbffe67b4245f72db9776f83ede4a189e Oct 06 16:16:10 crc kubenswrapper[4763]: I1006 16:16:10.097326 4763 generic.go:334] "Generic (PLEG): container finished" podID="f376097e-23a1-430d-927b-af057072ac6b" containerID="27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd" exitCode=0 Oct 06 16:16:10 crc kubenswrapper[4763]: I1006 16:16:10.097396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerDied","Data":"27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd"} Oct 06 16:16:10 crc kubenswrapper[4763]: I1006 16:16:10.097469 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerStarted","Data":"db27410dab7d212cb58e91c9f4eda23cbffe67b4245f72db9776f83ede4a189e"} Oct 06 16:16:11 crc kubenswrapper[4763]: I1006 16:16:11.107092 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerStarted","Data":"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15"} Oct 06 16:16:12 crc kubenswrapper[4763]: I1006 16:16:12.130579 4763 generic.go:334] "Generic (PLEG): container finished" podID="f376097e-23a1-430d-927b-af057072ac6b" containerID="3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15" exitCode=0 Oct 06 16:16:12 crc kubenswrapper[4763]: I1006 16:16:12.130888 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerDied","Data":"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15"} Oct 06 16:16:12 crc kubenswrapper[4763]: I1006 16:16:12.945972 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:12 crc kubenswrapper[4763]: I1006 16:16:12.946320 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:13 crc kubenswrapper[4763]: I1006 16:16:13.003978 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:13 crc 
Oct 06 16:16:13 crc kubenswrapper[4763]: I1006 16:16:13.171420 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7htsm" podStartSLOduration=2.701020637 podStartE2EDuration="5.171401606s" podCreationTimestamp="2025-10-06 16:16:08 +0000 UTC" firstStartedPulling="2025-10-06 16:16:10.09947571 +0000 UTC m=+4967.254768252" lastFinishedPulling="2025-10-06 16:16:12.569856709 +0000 UTC m=+4969.725149221" observedRunningTime="2025-10-06 16:16:13.166684559 +0000 UTC m=+4970.321977091" watchObservedRunningTime="2025-10-06 16:16:13.171401606 +0000 UTC m=+4970.326694108"
Oct 06 16:16:13 crc kubenswrapper[4763]: I1006 16:16:13.188950 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w8nvk"
Oct 06 16:16:14 crc kubenswrapper[4763]: I1006 16:16:14.179476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:14 crc kubenswrapper[4763]: I1006 16:16:14.179534 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:14 crc kubenswrapper[4763]: I1006 16:16:14.224126 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:15 crc kubenswrapper[4763]: I1006 16:16:15.215212 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zr4px"
Oct 06 16:16:17 crc kubenswrapper[4763]: I1006 16:16:17.412051 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"]
Oct 06 16:16:17 crc kubenswrapper[4763]: I1006 16:16:17.412544 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w8nvk" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="registry-server" containerID="cri-o://5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09" gracePeriod=2
Oct 06 16:16:18 crc kubenswrapper[4763]: I1006 16:16:18.961273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7htsm"
Oct 06 16:16:18 crc kubenswrapper[4763]: I1006 16:16:18.961322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7htsm"
Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.022164 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7htsm"
Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.192389 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerID="5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09" exitCode=0
Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.192512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerDied","Data":"5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09"}
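The registry-server container above was killed with gracePeriod=2, and the PLEG observed it exit with code 0 about 1.8s later, i.e. it shut down inside the grace window rather than being force-killed. The same timestamp subtraction as before confirms it:

```python
from datetime import datetime

FMT = "%H:%M:%S.%f"

killed = datetime.strptime("16:16:17.412544", FMT)  # "Killing container with a grace period"
died = datetime.strptime("16:16:19.192389", FMT)    # container finished, exitCode=0

print((died - killed).total_seconds())  # ~1.78 s, inside the 2 s gracePeriod
```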
event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerDied","Data":"5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09"} Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.219980 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.220206 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zr4px" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="registry-server" containerID="cri-o://8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e" gracePeriod=2 Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.258328 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.708140 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.714682 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.829981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content\") pod \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rksx\" (UniqueName: \"kubernetes.io/projected/643f1f67-a16d-4a0e-abaa-9e832c2ae013-kube-api-access-6rksx\") pod \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content\") pod \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830187 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities\") pod \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830221 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities\") pod \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\" (UID: \"643f1f67-a16d-4a0e-abaa-9e832c2ae013\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zljrv\" (UniqueName: \"kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv\") pod \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\" (UID: \"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5\") " Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.830960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities" (OuterVolumeSpecName: "utilities") pod "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" (UID: "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.831056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities" (OuterVolumeSpecName: "utilities") pod "643f1f67-a16d-4a0e-abaa-9e832c2ae013" (UID: "643f1f67-a16d-4a0e-abaa-9e832c2ae013"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.835914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643f1f67-a16d-4a0e-abaa-9e832c2ae013-kube-api-access-6rksx" (OuterVolumeSpecName: "kube-api-access-6rksx") pod "643f1f67-a16d-4a0e-abaa-9e832c2ae013" (UID: "643f1f67-a16d-4a0e-abaa-9e832c2ae013"). InnerVolumeSpecName "kube-api-access-6rksx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.838979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv" (OuterVolumeSpecName: "kube-api-access-zljrv") pod "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" (UID: "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5"). InnerVolumeSpecName "kube-api-access-zljrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.885848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" (UID: "1d81ee6a-1b07-4096-ba7a-c65cd68c57e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.892970 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "643f1f67-a16d-4a0e-abaa-9e832c2ae013" (UID: "643f1f67-a16d-4a0e-abaa-9e832c2ae013"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.931708 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.932034 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.932191 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zljrv\" (UniqueName: \"kubernetes.io/projected/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-kube-api-access-zljrv\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.932316 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643f1f67-a16d-4a0e-abaa-9e832c2ae013-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.932455 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rksx\" (UniqueName: \"kubernetes.io/projected/643f1f67-a16d-4a0e-abaa-9e832c2ae013-kube-api-access-6rksx\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:19 crc kubenswrapper[4763]: I1006 16:16:19.932700 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.206017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8nvk" event={"ID":"1d81ee6a-1b07-4096-ba7a-c65cd68c57e5","Type":"ContainerDied","Data":"0d4642d8791a688059019ba614add96dd23e2383daa6af3ef1fa290cd748d76e"} Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.206079 4763 scope.go:117] "RemoveContainer" containerID="5f4238c5744f76914037b7bda097c37ba29670d610cfc7dd76d26edb61235b09" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.206380 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8nvk" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.212629 4763 generic.go:334] "Generic (PLEG): container finished" podID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerID="8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e" exitCode=0 Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.212708 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zr4px" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.212704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr4px" event={"ID":"643f1f67-a16d-4a0e-abaa-9e832c2ae013","Type":"ContainerDied","Data":"8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e"} Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.212777 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr4px" event={"ID":"643f1f67-a16d-4a0e-abaa-9e832c2ae013","Type":"ContainerDied","Data":"dec5ed5dc8643f71675672556bdc5af36811b3acf922ba310cd8a1d99c993597"} Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.257303 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"] Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.263574 4763 scope.go:117] "RemoveContainer" containerID="5493e7e290f5dbc2f88b6ed49c6e9eae3f0ba6fd2c156ce513af641a113e7ff2" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.265992 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w8nvk"] Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.272693 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.278121 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zr4px"] Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.294105 4763 scope.go:117] "RemoveContainer" containerID="1d277ee9e996ae4f1932ca33e43ab3276a84f439ae4af58fee13801744d50c98" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.322742 4763 scope.go:117] "RemoveContainer" containerID="8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.349066 4763 scope.go:117] "RemoveContainer" containerID="e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.394555 4763 scope.go:117] "RemoveContainer" containerID="44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.444403 4763 scope.go:117] "RemoveContainer" containerID="8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e" Oct 06 16:16:20 crc kubenswrapper[4763]: E1006 16:16:20.445100 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e\": container with ID starting with 8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e not found: ID does not exist" containerID="8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.445167 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e"} err="failed to get container status \"8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e\": rpc error: code = NotFound desc = could not find container \"8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e\": container with ID starting with 8b3405d6f2e1dcdddaec3fc2276b92d32e451a4fb3e9bab04e63dbaff946125e not found: ID does not exist" Oct 06 
16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.445212 4763 scope.go:117] "RemoveContainer" containerID="e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7" Oct 06 16:16:20 crc kubenswrapper[4763]: E1006 16:16:20.445842 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7\": container with ID starting with e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7 not found: ID does not exist" containerID="e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.445895 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7"} err="failed to get container status \"e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7\": rpc error: code = NotFound desc = could not find container \"e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7\": container with ID starting with e723d262abc457875be7d624908913fff7155b1b4920347d83ba801c630342c7 not found: ID does not exist" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.445930 4763 scope.go:117] "RemoveContainer" containerID="44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c" Oct 06 16:16:20 crc kubenswrapper[4763]: E1006 16:16:20.446412 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c\": container with ID starting with 44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c not found: ID does not exist" containerID="44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.446439 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c"} err="failed to get container status \"44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c\": rpc error: code = NotFound desc = could not find container \"44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c\": container with ID starting with 44a3140bdd9166f72f95b166905571cab366a2eb4a86e1bae4d249b4ff51001c not found: ID does not exist" Oct 06 16:16:20 crc kubenswrapper[4763]: I1006 16:16:20.614637 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"] Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.226712 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7htsm" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="registry-server" containerID="cri-o://55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f" gracePeriod=2 Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.589932 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" path="/var/lib/kubelet/pods/1d81ee6a-1b07-4096-ba7a-c65cd68c57e5/volumes" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.590911 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" path="/var/lib/kubelet/pods/643f1f67-a16d-4a0e-abaa-9e832c2ae013/volumes" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 
16:16:21.699191 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.875706 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities\") pod \"f376097e-23a1-430d-927b-af057072ac6b\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.875799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vcrz\" (UniqueName: \"kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz\") pod \"f376097e-23a1-430d-927b-af057072ac6b\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.875956 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content\") pod \"f376097e-23a1-430d-927b-af057072ac6b\" (UID: \"f376097e-23a1-430d-927b-af057072ac6b\") " Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.878078 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities" (OuterVolumeSpecName: "utilities") pod "f376097e-23a1-430d-927b-af057072ac6b" (UID: "f376097e-23a1-430d-927b-af057072ac6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.884000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz" (OuterVolumeSpecName: "kube-api-access-7vcrz") pod "f376097e-23a1-430d-927b-af057072ac6b" (UID: "f376097e-23a1-430d-927b-af057072ac6b"). InnerVolumeSpecName "kube-api-access-7vcrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.978370 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f376097e-23a1-430d-927b-af057072ac6b" (UID: "f376097e-23a1-430d-927b-af057072ac6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.978411 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:21 crc kubenswrapper[4763]: I1006 16:16:21.978746 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vcrz\" (UniqueName: \"kubernetes.io/projected/f376097e-23a1-430d-927b-af057072ac6b-kube-api-access-7vcrz\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.079997 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f376097e-23a1-430d-927b-af057072ac6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.238820 4763 generic.go:334] "Generic (PLEG): container finished" podID="f376097e-23a1-430d-927b-af057072ac6b" containerID="55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f" exitCode=0 Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.238908 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerDied","Data":"55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f"} Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.238946 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7htsm" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.238978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7htsm" event={"ID":"f376097e-23a1-430d-927b-af057072ac6b","Type":"ContainerDied","Data":"db27410dab7d212cb58e91c9f4eda23cbffe67b4245f72db9776f83ede4a189e"} Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.239013 4763 scope.go:117] "RemoveContainer" containerID="55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.268500 4763 scope.go:117] "RemoveContainer" containerID="3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.286057 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"] Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.290537 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7htsm"] Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.312821 4763 scope.go:117] "RemoveContainer" containerID="27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.330301 4763 scope.go:117] "RemoveContainer" containerID="55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f" Oct 06 16:16:22 crc kubenswrapper[4763]: E1006 16:16:22.330756 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f\": container with ID starting with 55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f not found: ID does not exist" containerID="55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.330800 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f"} err="failed to get container status \"55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f\": rpc error: code = NotFound desc = could not find container \"55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f\": container with ID starting with 55726cdbb00955d7ca2e2cc3381a61e483bf54c29a46cb2a9ee6dba4a530727f not found: ID does not exist" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.330835 4763 scope.go:117] "RemoveContainer" containerID="3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15" Oct 06 16:16:22 crc kubenswrapper[4763]: E1006 16:16:22.331346 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15\": container with ID starting with 3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15 not found: ID does not exist" containerID="3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.331396 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15"} err="failed to get container status \"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15\": rpc error: code = NotFound desc = could not find container \"3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15\": container with ID starting with 3d1de37157a680c8e75de7c5341a15db473830779342888ab114b6e37aabde15 not found: ID does not exist" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.331424 4763 scope.go:117] "RemoveContainer" containerID="27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd" Oct 06 16:16:22 crc kubenswrapper[4763]: E1006 16:16:22.331830 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd\": container with ID starting with 27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd not found: ID does not exist" containerID="27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd" Oct 06 16:16:22 crc kubenswrapper[4763]: I1006 16:16:22.331861 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd"} err="failed to get container status \"27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd\": rpc error: code = NotFound desc = could not find container \"27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd\": container with ID starting with 27e33d6b0916cee4786e4315a7c6c2b1e45d39ac54c5110088a0e434160362dd not found: ID does not exist" Oct 06 16:16:23 crc kubenswrapper[4763]: I1006 16:16:23.586468 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f376097e-23a1-430d-927b-af057072ac6b" path="/var/lib/kubelet/pods/f376097e-23a1-430d-927b-af057072ac6b/volumes" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.317787 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318769 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318787 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318801 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318808 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318828 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318838 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318853 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318862 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318879 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318890 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318911 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318921 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318951 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318960 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318975 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.318982 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="extract-utilities" Oct 06 16:16:50 crc kubenswrapper[4763]: E1006 16:16:50.318996 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.319006 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="extract-content" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.319182 4763 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f376097e-23a1-430d-927b-af057072ac6b" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.319206 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d81ee6a-1b07-4096-ba7a-c65cd68c57e5" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.319226 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="643f1f67-a16d-4a0e-abaa-9e832c2ae013" containerName="registry-server" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.319892 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.322730 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r7b94" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.329782 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.465780 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.466490 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dfp\" (UniqueName: \"kubernetes.io/projected/85cf1d75-f8dd-4ab2-8c67-f3622e156c38-kube-api-access-48dfp\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.568430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dfp\" (UniqueName: \"kubernetes.io/projected/85cf1d75-f8dd-4ab2-8c67-f3622e156c38-kube-api-access-48dfp\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.568551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.572198 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.572245 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c051d432f7b3cd4482c16bb49597b8a7f66d1cff06d4e220dc6319c3cd14f92d/globalmount\"" pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.594805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dfp\" (UniqueName: \"kubernetes.io/projected/85cf1d75-f8dd-4ab2-8c67-f3622e156c38-kube-api-access-48dfp\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.610985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98bf474b-40cb-4d45-821f-6385dcb9be64\") pod \"mariadb-copy-data\" (UID: \"85cf1d75-f8dd-4ab2-8c67-f3622e156c38\") " pod="openstack/mariadb-copy-data" Oct 06 16:16:50 crc kubenswrapper[4763]: I1006 16:16:50.652908 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 06 16:16:51 crc kubenswrapper[4763]: I1006 16:16:51.218299 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 06 16:16:51 crc kubenswrapper[4763]: W1006 16:16:51.220035 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85cf1d75_f8dd_4ab2_8c67_f3622e156c38.slice/crio-73251dd33dd832b0b04a6ebbbf06028b39b9a1ef6a3344003097363602ce393c WatchSource:0}: Error finding container 73251dd33dd832b0b04a6ebbbf06028b39b9a1ef6a3344003097363602ce393c: Status 404 returned error can't find the container with id 73251dd33dd832b0b04a6ebbbf06028b39b9a1ef6a3344003097363602ce393c Oct 06 16:16:51 crc kubenswrapper[4763]: I1006 16:16:51.536538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"85cf1d75-f8dd-4ab2-8c67-f3622e156c38","Type":"ContainerStarted","Data":"1539ccf191ab0875de769e41f81ed4f0911d5017c46d0467b2feebda8e892b8a"} Oct 06 16:16:51 crc kubenswrapper[4763]: I1006 16:16:51.536901 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"85cf1d75-f8dd-4ab2-8c67-f3622e156c38","Type":"ContainerStarted","Data":"73251dd33dd832b0b04a6ebbbf06028b39b9a1ef6a3344003097363602ce393c"} Oct 06 16:16:51 crc kubenswrapper[4763]: I1006 16:16:51.552823 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.5528051019999998 podStartE2EDuration="2.552805102s" podCreationTimestamp="2025-10-06 16:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:16:51.54786011 +0000 UTC m=+5008.703152622" watchObservedRunningTime="2025-10-06 16:16:51.552805102 +0000 UTC m=+5008.708097624" Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.386899 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 06 
16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.388146 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.395287 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.541096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7kr\" (UniqueName: \"kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr\") pod \"mariadb-client\" (UID: \"2a8829d0-507a-4834-ae0d-9cc5fc9b988f\") " pod="openstack/mariadb-client" Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.643368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7kr\" (UniqueName: \"kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr\") pod \"mariadb-client\" (UID: \"2a8829d0-507a-4834-ae0d-9cc5fc9b988f\") " pod="openstack/mariadb-client" Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.667102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7kr\" (UniqueName: \"kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr\") pod \"mariadb-client\" (UID: \"2a8829d0-507a-4834-ae0d-9cc5fc9b988f\") " pod="openstack/mariadb-client" Oct 06 16:16:53 crc kubenswrapper[4763]: I1006 16:16:53.720649 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:54 crc kubenswrapper[4763]: I1006 16:16:54.251589 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:54 crc kubenswrapper[4763]: W1006 16:16:54.255174 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8829d0_507a_4834_ae0d_9cc5fc9b988f.slice/crio-14647f2aad3d253e1c7fd251860c21f65c6d70746951915335842043abb6479a WatchSource:0}: Error finding container 14647f2aad3d253e1c7fd251860c21f65c6d70746951915335842043abb6479a: Status 404 returned error can't find the container with id 14647f2aad3d253e1c7fd251860c21f65c6d70746951915335842043abb6479a Oct 06 16:16:54 crc kubenswrapper[4763]: I1006 16:16:54.563539 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" containerID="f1f4d4b3e1cef47791ec883138f40cf366b55f78afd534d24b615083046aca47" exitCode=0 Oct 06 16:16:54 crc kubenswrapper[4763]: I1006 16:16:54.563581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2a8829d0-507a-4834-ae0d-9cc5fc9b988f","Type":"ContainerDied","Data":"f1f4d4b3e1cef47791ec883138f40cf366b55f78afd534d24b615083046aca47"} Oct 06 16:16:54 crc kubenswrapper[4763]: I1006 16:16:54.563605 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2a8829d0-507a-4834-ae0d-9cc5fc9b988f","Type":"ContainerStarted","Data":"14647f2aad3d253e1c7fd251860c21f65c6d70746951915335842043abb6479a"} Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.869749 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.892051 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_2a8829d0-507a-4834-ae0d-9cc5fc9b988f/mariadb-client/0.log" Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.912313 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.919018 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.977042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd7kr\" (UniqueName: \"kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr\") pod \"2a8829d0-507a-4834-ae0d-9cc5fc9b988f\" (UID: \"2a8829d0-507a-4834-ae0d-9cc5fc9b988f\") " Oct 06 16:16:55 crc kubenswrapper[4763]: I1006 16:16:55.986694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr" (OuterVolumeSpecName: "kube-api-access-jd7kr") pod "2a8829d0-507a-4834-ae0d-9cc5fc9b988f" (UID: "2a8829d0-507a-4834-ae0d-9cc5fc9b988f"). InnerVolumeSpecName "kube-api-access-jd7kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.073152 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:56 crc kubenswrapper[4763]: E1006 16:16:56.073526 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" containerName="mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.073543 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" containerName="mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.073815 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" containerName="mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.074439 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.079129 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd7kr\" (UniqueName: \"kubernetes.io/projected/2a8829d0-507a-4834-ae0d-9cc5fc9b988f-kube-api-access-jd7kr\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.080207 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.179971 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptv2x\" (UniqueName: \"kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x\") pod \"mariadb-client\" (UID: \"42fe9fdd-b08a-4c1b-b744-e921c528eda9\") " pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.281269 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptv2x\" (UniqueName: \"kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x\") pod \"mariadb-client\" (UID: \"42fe9fdd-b08a-4c1b-b744-e921c528eda9\") " pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.296207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptv2x\" (UniqueName: \"kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x\") pod \"mariadb-client\" (UID: \"42fe9fdd-b08a-4c1b-b744-e921c528eda9\") " pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.402371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.581248 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14647f2aad3d253e1c7fd251860c21f65c6d70746951915335842043abb6479a" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.581327 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.599515 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" podUID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" Oct 06 16:16:56 crc kubenswrapper[4763]: I1006 16:16:56.884976 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:57 crc kubenswrapper[4763]: I1006 16:16:57.582818 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8829d0-507a-4834-ae0d-9cc5fc9b988f" path="/var/lib/kubelet/pods/2a8829d0-507a-4834-ae0d-9cc5fc9b988f/volumes" Oct 06 16:16:57 crc kubenswrapper[4763]: I1006 16:16:57.587837 4763 generic.go:334] "Generic (PLEG): container finished" podID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" containerID="defafed7fed188dde09379e300ace0911cc8ba44a1acc3e7b176adf6b33461a2" exitCode=0 Oct 06 16:16:57 crc kubenswrapper[4763]: I1006 16:16:57.587874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"42fe9fdd-b08a-4c1b-b744-e921c528eda9","Type":"ContainerDied","Data":"defafed7fed188dde09379e300ace0911cc8ba44a1acc3e7b176adf6b33461a2"} Oct 06 16:16:57 crc kubenswrapper[4763]: I1006 16:16:57.587896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"42fe9fdd-b08a-4c1b-b744-e921c528eda9","Type":"ContainerStarted","Data":"a90b6f140af2a80e13c397eba6d43c0bc516e902f39a2e69c94e7229c508754f"} Oct 06 16:16:58 crc kubenswrapper[4763]: I1006 16:16:58.888813 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:16:58 crc kubenswrapper[4763]: I1006 16:16:58.911352 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_42fe9fdd-b08a-4c1b-b744-e921c528eda9/mariadb-client/0.log" Oct 06 16:16:58 crc kubenswrapper[4763]: I1006 16:16:58.937503 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:58 crc kubenswrapper[4763]: I1006 16:16:58.942757 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.021407 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptv2x\" (UniqueName: \"kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x\") pod \"42fe9fdd-b08a-4c1b-b744-e921c528eda9\" (UID: \"42fe9fdd-b08a-4c1b-b744-e921c528eda9\") " Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.029846 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x" (OuterVolumeSpecName: "kube-api-access-ptv2x") pod "42fe9fdd-b08a-4c1b-b744-e921c528eda9" (UID: "42fe9fdd-b08a-4c1b-b744-e921c528eda9"). InnerVolumeSpecName "kube-api-access-ptv2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.124041 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptv2x\" (UniqueName: \"kubernetes.io/projected/42fe9fdd-b08a-4c1b-b744-e921c528eda9-kube-api-access-ptv2x\") on node \"crc\" DevicePath \"\"" Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.582428 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" path="/var/lib/kubelet/pods/42fe9fdd-b08a-4c1b-b744-e921c528eda9/volumes" Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.601681 4763 scope.go:117] "RemoveContainer" containerID="defafed7fed188dde09379e300ace0911cc8ba44a1acc3e7b176adf6b33461a2" Oct 06 16:16:59 crc kubenswrapper[4763]: I1006 16:16:59.601785 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 06 16:17:03 crc kubenswrapper[4763]: I1006 16:17:03.876737 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:17:03 crc kubenswrapper[4763]: I1006 16:17:03.877380 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.700604 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 16:17:31 crc kubenswrapper[4763]: E1006 16:17:31.701680 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" containerName="mariadb-client" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.701694 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" containerName="mariadb-client" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.701899 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fe9fdd-b08a-4c1b-b744-e921c528eda9" containerName="mariadb-client" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.702919 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.705259 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.705551 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rnmx4" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.705932 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.708569 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.715205 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.716585 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.745105 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.746389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.755028 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.769473 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.839780 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.839868 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-config\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.839908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.839979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzm24\" (UniqueName: \"kubernetes.io/projected/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-kube-api-access-hzm24\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840075 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840136 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh24b\" (UniqueName: \"kubernetes.io/projected/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-kube-api-access-hh24b\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.840215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.886854 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.894853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.901373 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.901632 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6mb9w" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.901797 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.918090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.930017 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.931718 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942327 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942662 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942722 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942750 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh24b\" (UniqueName: \"kubernetes.io/projected/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-kube-api-access-hh24b\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942845 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c3e6677-0111-41dd-90c8-46dee432289f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942911 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vzfn\" (UniqueName: \"kubernetes.io/projected/3c3e6677-0111-41dd-90c8-46dee432289f-kube-api-access-2vzfn\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.942985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943009 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-config\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943031 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943060 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c3e6677-0111-41dd-90c8-46dee432289f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzm24\" (UniqueName: \"kubernetes.io/projected/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-kube-api-access-hzm24\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.943972 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.948516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.948941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.949770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-config\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.951111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.951778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.951838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.953089 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.953129 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8920048066e909723dd2bd6482719418326fb4bf01066c99e3d10cec4ca81447/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.954369 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.954442 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e88156e7e62034184819d3eb14ab4ad4b546945416b11d4ed170f04c3e5ba41b/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.956153 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.958121 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.962537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.968331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzm24\" (UniqueName: \"kubernetes.io/projected/e5065bdf-e83a-427c-a0cb-c4eaae128dcd-kube-api-access-hzm24\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.970773 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh24b\" (UniqueName: \"kubernetes.io/projected/0e16abe9-25ba-4111-93f8-73a9dddcb7e3-kube-api-access-hh24b\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.986362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.995841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fafdd004-0a18-41e1-95b1-dfe81b428b49\") pod \"ovsdbserver-nb-1\" (UID: \"e5065bdf-e83a-427c-a0cb-c4eaae128dcd\") " pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:31 crc kubenswrapper[4763]: I1006 16:17:31.996027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bdc236e-7379-4fcb-94f2-a9e92dbdd78c\") pod \"ovsdbserver-nb-0\" (UID: \"0e16abe9-25ba-4111-93f8-73a9dddcb7e3\") " pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa9ec132-af6a-4109-a05a-d492114d1f52-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c3e6677-0111-41dd-90c8-46dee432289f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044695 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vzfn\" (UniqueName: \"kubernetes.io/projected/3c3e6677-0111-41dd-90c8-46dee432289f-kube-api-access-2vzfn\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486912e6-bb8d-4973-8193-da8a59e0d4c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.044996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.045091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9ec132-af6a-4109-a05a-d492114d1f52-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.045183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xcz6\" (UniqueName: \"kubernetes.io/projected/aa9ec132-af6a-4109-a05a-d492114d1f52-kube-api-access-6xcz6\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.045764 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/486912e6-bb8d-4973-8193-da8a59e0d4c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-config\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046573 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c3e6677-0111-41dd-90c8-46dee432289f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9063050-15eb-4a66-adfb-846f24c6c9cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.046946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9063050-15eb-4a66-adfb-846f24c6c9cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnsq\" (UniqueName: \"kubernetes.io/projected/486912e6-bb8d-4973-8193-da8a59e0d4c9-kube-api-access-qdnsq\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047138 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047194 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2chz\" (UniqueName: \"kubernetes.io/projected/a9063050-15eb-4a66-adfb-846f24c6c9cb-kube-api-access-h2chz\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047278 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c3e6677-0111-41dd-90c8-46dee432289f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.047510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3e6677-0111-41dd-90c8-46dee432289f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.049488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c3e6677-0111-41dd-90c8-46dee432289f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.049642 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.049751 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c9fd18ce4dd7271441458c8ab20c9afdcfe938fe253a5ab21dc9fa93bd937af/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.052105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.060402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vzfn\" (UniqueName: \"kubernetes.io/projected/3c3e6677-0111-41dd-90c8-46dee432289f-kube-api-access-2vzfn\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.071038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.090357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f586e1d-f115-419b-a9a2-d148607c1e40\") pod \"ovsdbserver-nb-2\" (UID: \"3c3e6677-0111-41dd-90c8-46dee432289f\") " pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.102849 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148536 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9063050-15eb-4a66-adfb-846f24c6c9cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9063050-15eb-4a66-adfb-846f24c6c9cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnsq\" (UniqueName: \"kubernetes.io/projected/486912e6-bb8d-4973-8193-da8a59e0d4c9-kube-api-access-qdnsq\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2chz\" (UniqueName: \"kubernetes.io/projected/a9063050-15eb-4a66-adfb-846f24c6c9cb-kube-api-access-h2chz\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa9ec132-af6a-4109-a05a-d492114d1f52-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148772 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486912e6-bb8d-4973-8193-da8a59e0d4c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9ec132-af6a-4109-a05a-d492114d1f52-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xcz6\" (UniqueName: \"kubernetes.io/projected/aa9ec132-af6a-4109-a05a-d492114d1f52-kube-api-access-6xcz6\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/486912e6-bb8d-4973-8193-da8a59e0d4c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-config\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.148900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.149882 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.150946 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9063050-15eb-4a66-adfb-846f24c6c9cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9063050-15eb-4a66-adfb-846f24c6c9cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151369 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/486912e6-bb8d-4973-8193-da8a59e0d4c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151397 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa9ec132-af6a-4109-a05a-d492114d1f52-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.151801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/486912e6-bb8d-4973-8193-da8a59e0d4c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.152252 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9ec132-af6a-4109-a05a-d492114d1f52-config\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.152990 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.153008 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33dca5f127e379b6895145da4ca6a0293b2ee9a50ed4d8b0a93d04efd80ec398/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.153239 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.153257 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d3ccfd4fd752c2bf70c5a385251e29f488e27d9e9b8b9914c627e314cae84ac9/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.153443 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.153459 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d705aa8181569f523e1003376b2b1d8d43fd08291b096386856b54d419a85cd4/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.155076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9ec132-af6a-4109-a05a-d492114d1f52-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.155947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486912e6-bb8d-4973-8193-da8a59e0d4c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.173602 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xcz6\" (UniqueName: \"kubernetes.io/projected/aa9ec132-af6a-4109-a05a-d492114d1f52-kube-api-access-6xcz6\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.174008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnsq\" (UniqueName: \"kubernetes.io/projected/486912e6-bb8d-4973-8193-da8a59e0d4c9-kube-api-access-qdnsq\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.175317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9063050-15eb-4a66-adfb-846f24c6c9cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.176954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2chz\" (UniqueName: \"kubernetes.io/projected/a9063050-15eb-4a66-adfb-846f24c6c9cb-kube-api-access-h2chz\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.202106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ad324e9-232a-4e68-8d69-d1d0eec3f2d1\") pod \"ovsdbserver-sb-2\" (UID: \"486912e6-bb8d-4973-8193-da8a59e0d4c9\") " pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.206307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a45a647-1cd8-40cd-b2b1-a2b6c711e4d7\") pod \"ovsdbserver-sb-1\" (UID: \"aa9ec132-af6a-4109-a05a-d492114d1f52\") " pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.207086 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0247fff1-0e5f-405e-9ad0-8ffcc0ae64ca\") pod \"ovsdbserver-sb-0\" (UID: \"a9063050-15eb-4a66-adfb-846f24c6c9cb\") " pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.226271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.254358 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.322372 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.622956 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.711340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 06 16:17:32 crc kubenswrapper[4763]: W1006 16:17:32.732318 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5065bdf_e83a_427c_a0cb_c4eaae128dcd.slice/crio-60188b5bdc6d76bbfbd1e90f87274dfb760b7f86b0d2f52b913d4afbd1e2c162 WatchSource:0}: Error finding container 60188b5bdc6d76bbfbd1e90f87274dfb760b7f86b0d2f52b913d4afbd1e2c162: Status 404 returned error can't find the container with id 60188b5bdc6d76bbfbd1e90f87274dfb760b7f86b0d2f52b913d4afbd1e2c162
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.808535 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 06 16:17:32 crc kubenswrapper[4763]: W1006 16:17:32.817733 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9063050_15eb_4a66_adfb_846f24c6c9cb.slice/crio-ace5b46fbcd57dd9704e2c7c5f5b1777db71fe276bc9b35d268b6341a80d2fe2 WatchSource:0}: Error finding container ace5b46fbcd57dd9704e2c7c5f5b1777db71fe276bc9b35d268b6341a80d2fe2: Status 404 returned error can't find the container with id ace5b46fbcd57dd9704e2c7c5f5b1777db71fe276bc9b35d268b6341a80d2fe2
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.890141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a9063050-15eb-4a66-adfb-846f24c6c9cb","Type":"ContainerStarted","Data":"ace5b46fbcd57dd9704e2c7c5f5b1777db71fe276bc9b35d268b6341a80d2fe2"}
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.892131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e16abe9-25ba-4111-93f8-73a9dddcb7e3","Type":"ContainerStarted","Data":"f8977788f854a56f748f2f93da96361ceebc99f87f1d44087b2e8c678a017071"}
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.892195 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e16abe9-25ba-4111-93f8-73a9dddcb7e3","Type":"ContainerStarted","Data":"ca3ab7bdc9a5d9a35cb490d9f6a5de89994fe2a8c84aa6aa83d798b41b9b12ec"}
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.893535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e5065bdf-e83a-427c-a0cb-c4eaae128dcd","Type":"ContainerStarted","Data":"60188b5bdc6d76bbfbd1e90f87274dfb760b7f86b0d2f52b913d4afbd1e2c162"}
Oct 06 16:17:32 crc kubenswrapper[4763]: I1006 16:17:32.923837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.259553 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 06 16:17:33 crc kubenswrapper[4763]: W1006 16:17:33.263510 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c3e6677_0111_41dd_90c8_46dee432289f.slice/crio-82cac4035b9ca253a1c30bc999ffbc4872e909ccf3a23556d2f3996416c8e4fe WatchSource:0}: Error finding container 82cac4035b9ca253a1c30bc999ffbc4872e909ccf3a23556d2f3996416c8e4fe: Status 404 returned error can't find the container with id 82cac4035b9ca253a1c30bc999ffbc4872e909ccf3a23556d2f3996416c8e4fe
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.773063 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 06 16:17:33 crc kubenswrapper[4763]: W1006 16:17:33.786462 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa9ec132_af6a_4109_a05a_d492114d1f52.slice/crio-243083276e53f129a0bfd8b3e28cf0216a62600438c71073997e760eaeac47e9 WatchSource:0}: Error finding container 243083276e53f129a0bfd8b3e28cf0216a62600438c71073997e760eaeac47e9: Status 404 returned error can't find the container with id 243083276e53f129a0bfd8b3e28cf0216a62600438c71073997e760eaeac47e9
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.876440 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.876507 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.905003 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"486912e6-bb8d-4973-8193-da8a59e0d4c9","Type":"ContainerStarted","Data":"c01e890350e580dfffaa3553001cda5fd967d7f2cf4b42d6f030162793ed0c58"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.905045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"486912e6-bb8d-4973-8193-da8a59e0d4c9","Type":"ContainerStarted","Data":"e14b901e962546b882b99855fa09fafe6be7e695047ea7879fd31a7d46691fed"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.905056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"486912e6-bb8d-4973-8193-da8a59e0d4c9","Type":"ContainerStarted","Data":"055c7b1d59a35bf35638e3907463cf81f078ed2f5ef77907a0505fb49103fbf7"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.908897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a9063050-15eb-4a66-adfb-846f24c6c9cb","Type":"ContainerStarted","Data":"e9f43848d03a07ec48b01db3ebe5401b95e7388ea14624954873c72ca77d087c"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.908947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a9063050-15eb-4a66-adfb-846f24c6c9cb","Type":"ContainerStarted","Data":"bab976c04f6490d1170dd5c6ab512c4f330918599fe179a049531b99216668b8"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.911021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0e16abe9-25ba-4111-93f8-73a9dddcb7e3","Type":"ContainerStarted","Data":"16ef788be6c0b1be687a1cfb14cbc7c7d773cbd2ee169745a034f32859325bca"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.912923 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3c3e6677-0111-41dd-90c8-46dee432289f","Type":"ContainerStarted","Data":"2ca2733c79bfd7313366d03761b1d1bd7c6f32e3ed67a1ea788b113b0a723045"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.912971 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3c3e6677-0111-41dd-90c8-46dee432289f","Type":"ContainerStarted","Data":"27b33116bd7ae45c854bc086a9328d65f90c44b0d424eafd344c289d1935a747"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.912994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3c3e6677-0111-41dd-90c8-46dee432289f","Type":"ContainerStarted","Data":"82cac4035b9ca253a1c30bc999ffbc4872e909ccf3a23556d2f3996416c8e4fe"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.914220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"aa9ec132-af6a-4109-a05a-d492114d1f52","Type":"ContainerStarted","Data":"243083276e53f129a0bfd8b3e28cf0216a62600438c71073997e760eaeac47e9"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.916018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e5065bdf-e83a-427c-a0cb-c4eaae128dcd","Type":"ContainerStarted","Data":"8d6d9b4df9b9c94f28cdd6447d2b570b73d63dcd285a91d1a33caceb2074e66c"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.916043 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"e5065bdf-e83a-427c-a0cb-c4eaae128dcd","Type":"ContainerStarted","Data":"f8d590d15e1212b4e5e9fe310e49a6c8c591beac455b9a0b77766d8db2ddbd91"}
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.935592 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.935576899 podStartE2EDuration="3.935576899s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:33.930160144 +0000 UTC m=+5051.085452666" watchObservedRunningTime="2025-10-06 16:17:33.935576899 +0000 UTC m=+5051.090869411"
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.952594 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.952574915 podStartE2EDuration="3.952574915s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:33.945427343 +0000 UTC m=+5051.100719855" watchObservedRunningTime="2025-10-06 16:17:33.952574915 +0000 UTC m=+5051.107867427"
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.967240 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.967222147 podStartE2EDuration="3.967222147s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:33.964927606 +0000 UTC m=+5051.120220148" watchObservedRunningTime="2025-10-06 16:17:33.967222147 +0000 UTC m=+5051.122514659"
Oct 06 16:17:33 crc kubenswrapper[4763]: I1006 16:17:33.984814 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.984799938 podStartE2EDuration="3.984799938s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:33.982299231 +0000 UTC m=+5051.137591733" watchObservedRunningTime="2025-10-06 16:17:33.984799938 +0000 UTC m=+5051.140092450"
Oct 06 16:17:34 crc kubenswrapper[4763]: I1006 16:17:34.008659 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.008636247 podStartE2EDuration="4.008636247s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:34.002915034 +0000 UTC m=+5051.158207536" watchObservedRunningTime="2025-10-06 16:17:34.008636247 +0000 UTC m=+5051.163928769"
Oct 06 16:17:34 crc kubenswrapper[4763]: I1006 16:17:34.923742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"aa9ec132-af6a-4109-a05a-d492114d1f52","Type":"ContainerStarted","Data":"93f5224ddd764d24825244f1f2fa93792efd37103086d03aef8b3986b2d1b453"}
Oct 06 16:17:34 crc kubenswrapper[4763]: I1006 16:17:34.923793 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"aa9ec132-af6a-4109-a05a-d492114d1f52","Type":"ContainerStarted","Data":"2ba8f35e5da12c8b6524a1769e62e2fc038b479a1b5a3dc48e4d9f5ead4b34ec"}
Oct 06 16:17:34 crc kubenswrapper[4763]: I1006 16:17:34.944541 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.944519852 podStartE2EDuration="4.944519852s" podCreationTimestamp="2025-10-06 16:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:34.938357707 +0000 UTC m=+5052.093650229" watchObservedRunningTime="2025-10-06 16:17:34.944519852 +0000 UTC m=+5052.099812364"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.052914 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.071977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.103947 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.117017 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.140946 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.226993 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.255629 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.292043 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.323444 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.933382 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.933423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:35 crc kubenswrapper[4763]: I1006 16:17:35.933434 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.104010 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.126076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.142154 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.256108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.274692 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.323289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.416588 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-5jnk5"]
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.419128 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.421501 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.431586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-5jnk5"]
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.542448 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-5jnk5"]
Oct 06 16:17:37 crc kubenswrapper[4763]: E1006 16:17:37.543087 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-998fp ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-547968cc8f-5jnk5" podUID="cd6fba53-49ad-4630-b379-760663489787"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.543366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.543506 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.543536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.543564 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998fp\" (UniqueName: \"kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.566764 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"]
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.567997 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.570773 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.595396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"]
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645534 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645566 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-998fp\" (UniqueName: \"kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645692 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.645824 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.647281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.647930 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.649553 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.682351 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-998fp\" (UniqueName: \"kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp\") pod \"dnsmasq-dns-547968cc8f-5jnk5\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") " pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.747885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.748001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749171 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbpgv\" (UniqueName: \"kubernetes.io/projected/0a2b2192-1baf-4aef-929c-908e4b0afd12-kube-api-access-bbpgv\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.749442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.750071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.769445 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbpgv\" (UniqueName: \"kubernetes.io/projected/0a2b2192-1baf-4aef-929c-908e4b0afd12-kube-api-access-bbpgv\") pod \"dnsmasq-dns-7c54468fdc-2fvtj\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.885531 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:37 crc kubenswrapper[4763]: I1006 16:17:37.951661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.005125 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-5jnk5"
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.055304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config\") pod \"cd6fba53-49ad-4630-b379-760663489787\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") "
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.055360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb\") pod \"cd6fba53-49ad-4630-b379-760663489787\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") "
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.055389 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc\") pod \"cd6fba53-49ad-4630-b379-760663489787\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") "
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.055572 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-998fp\" (UniqueName: \"kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp\") pod \"cd6fba53-49ad-4630-b379-760663489787\" (UID: \"cd6fba53-49ad-4630-b379-760663489787\") "
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.056275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config" (OuterVolumeSpecName: "config") pod "cd6fba53-49ad-4630-b379-760663489787" (UID: "cd6fba53-49ad-4630-b379-760663489787"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.056314 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd6fba53-49ad-4630-b379-760663489787" (UID: "cd6fba53-49ad-4630-b379-760663489787"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.056710 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd6fba53-49ad-4630-b379-760663489787" (UID: "cd6fba53-49ad-4630-b379-760663489787"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.070183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp" (OuterVolumeSpecName: "kube-api-access-998fp") pod "cd6fba53-49ad-4630-b379-760663489787" (UID: "cd6fba53-49ad-4630-b379-760663489787"). InnerVolumeSpecName "kube-api-access-998fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.141234 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.159716 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-998fp\" (UniqueName: \"kubernetes.io/projected/cd6fba53-49ad-4630-b379-760663489787-kube-api-access-998fp\") on node \"crc\" DevicePath \"\"" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.159771 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.159792 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.159806 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd6fba53-49ad-4630-b379-760663489787-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.186070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.300306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 06 16:17:38 crc kubenswrapper[4763]: W1006 16:17:38.332691 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2b2192_1baf_4aef_929c_908e4b0afd12.slice/crio-905d78a514cc5d3f81b322b789c1b3385a07035714e56b85710b75c0d9f11095 WatchSource:0}: Error finding container 905d78a514cc5d3f81b322b789c1b3385a07035714e56b85710b75c0d9f11095: Status 404 returned error can't find the container with id 905d78a514cc5d3f81b322b789c1b3385a07035714e56b85710b75c0d9f11095 Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.336484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"] Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.356396 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.360107 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.971472 4763 generic.go:334] "Generic (PLEG): container finished" podID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerID="c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad" exitCode=0 Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.971564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" event={"ID":"0a2b2192-1baf-4aef-929c-908e4b0afd12","Type":"ContainerDied","Data":"c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad"} Oct 06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.972112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" event={"ID":"0a2b2192-1baf-4aef-929c-908e4b0afd12","Type":"ContainerStarted","Data":"905d78a514cc5d3f81b322b789c1b3385a07035714e56b85710b75c0d9f11095"} Oct 
06 16:17:38 crc kubenswrapper[4763]: I1006 16:17:38.972156 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-5jnk5" Oct 06 16:17:39 crc kubenswrapper[4763]: I1006 16:17:39.078660 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 06 16:17:39 crc kubenswrapper[4763]: I1006 16:17:39.203973 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-5jnk5"] Oct 06 16:17:39 crc kubenswrapper[4763]: I1006 16:17:39.209056 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-5jnk5"] Oct 06 16:17:39 crc kubenswrapper[4763]: I1006 16:17:39.590899 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6fba53-49ad-4630-b379-760663489787" path="/var/lib/kubelet/pods/cd6fba53-49ad-4630-b379-760663489787/volumes" Oct 06 16:17:39 crc kubenswrapper[4763]: I1006 16:17:39.990308 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" event={"ID":"0a2b2192-1baf-4aef-929c-908e4b0afd12","Type":"ContainerStarted","Data":"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885"} Oct 06 16:17:40 crc kubenswrapper[4763]: I1006 16:17:40.025819 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" podStartSLOduration=3.025795812 podStartE2EDuration="3.025795812s" podCreationTimestamp="2025-10-06 16:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:40.024528708 +0000 UTC m=+5057.179821250" watchObservedRunningTime="2025-10-06 16:17:40.025795812 +0000 UTC m=+5057.181088334" Oct 06 16:17:40 crc kubenswrapper[4763]: I1006 16:17:40.999077 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.588051 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.589100 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.593430 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.596351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.733775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/dec9fc58-5530-46e0-8518-edd126a266f8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.734114 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.734197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ss6\" (UniqueName: \"kubernetes.io/projected/dec9fc58-5530-46e0-8518-edd126a266f8-kube-api-access-q7ss6\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.835764 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ss6\" (UniqueName: \"kubernetes.io/projected/dec9fc58-5530-46e0-8518-edd126a266f8-kube-api-access-q7ss6\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.835884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/dec9fc58-5530-46e0-8518-edd126a266f8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.836076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data" Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.841413 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.841491 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aefa9e286e4c09ce77a13e2134d6b005ddc88318fd37346bdb4a99dfa3578824/globalmount\"" pod="openstack/ovn-copy-data"
Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.847069 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/dec9fc58-5530-46e0-8518-edd126a266f8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data"
Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.864243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ss6\" (UniqueName: \"kubernetes.io/projected/dec9fc58-5530-46e0-8518-edd126a266f8-kube-api-access-q7ss6\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data"
Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.893535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27191f76-aa4f-4e4a-890b-727bb5e3784f\") pod \"ovn-copy-data\" (UID: \"dec9fc58-5530-46e0-8518-edd126a266f8\") " pod="openstack/ovn-copy-data"
Oct 06 16:17:41 crc kubenswrapper[4763]: I1006 16:17:41.921725 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Oct 06 16:17:42 crc kubenswrapper[4763]: I1006 16:17:42.470781 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Oct 06 16:17:42 crc kubenswrapper[4763]: W1006 16:17:42.480803 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec9fc58_5530_46e0_8518_edd126a266f8.slice/crio-863f665a2ff5daedcbfb46a0b4ed29ebdecd723af15a06f80e68ec076f5fedde WatchSource:0}: Error finding container 863f665a2ff5daedcbfb46a0b4ed29ebdecd723af15a06f80e68ec076f5fedde: Status 404 returned error can't find the container with id 863f665a2ff5daedcbfb46a0b4ed29ebdecd723af15a06f80e68ec076f5fedde
Oct 06 16:17:43 crc kubenswrapper[4763]: I1006 16:17:43.021647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"dec9fc58-5530-46e0-8518-edd126a266f8","Type":"ContainerStarted","Data":"2ff1df9634281de06444581e97ec128a34e7aa40adb179533e5b8dafe61d2b67"}
Oct 06 16:17:43 crc kubenswrapper[4763]: I1006 16:17:43.021714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"dec9fc58-5530-46e0-8518-edd126a266f8","Type":"ContainerStarted","Data":"863f665a2ff5daedcbfb46a0b4ed29ebdecd723af15a06f80e68ec076f5fedde"}
Oct 06 16:17:43 crc kubenswrapper[4763]: I1006 16:17:43.052710 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.052677871 podStartE2EDuration="3.052677871s" podCreationTimestamp="2025-10-06 16:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:43.045418507 +0000 UTC m=+5060.200711029" watchObservedRunningTime="2025-10-06 16:17:43.052677871 +0000 UTC m=+5060.207970433"
Oct 06 16:17:47 crc kubenswrapper[4763]: I1006 16:17:47.887855 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj"
Oct 06 16:17:47 crc kubenswrapper[4763]: I1006 16:17:47.951005 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"]
Oct 06 16:17:47 crc kubenswrapper[4763]: I1006 16:17:47.951683 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="dnsmasq-dns" containerID="cri-o://46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af" gracePeriod=10
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.488549 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.560274 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5xg5\" (UniqueName: \"kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5\") pod \"1c316e7d-6e65-42c3-b67d-93fc3027979d\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") "
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.560749 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config\") pod \"1c316e7d-6e65-42c3-b67d-93fc3027979d\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") "
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.560831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc\") pod \"1c316e7d-6e65-42c3-b67d-93fc3027979d\" (UID: \"1c316e7d-6e65-42c3-b67d-93fc3027979d\") "
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.567794 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5" (OuterVolumeSpecName: "kube-api-access-s5xg5") pod "1c316e7d-6e65-42c3-b67d-93fc3027979d" (UID: "1c316e7d-6e65-42c3-b67d-93fc3027979d"). InnerVolumeSpecName "kube-api-access-s5xg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.567997 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 16:17:48 crc kubenswrapper[4763]: E1006 16:17:48.568391 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="dnsmasq-dns"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.568421 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="dnsmasq-dns"
Oct 06 16:17:48 crc kubenswrapper[4763]: E1006 16:17:48.568463 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="init"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.568472 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="init"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.578153 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerName="dnsmasq-dns"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.580531 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.585820 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.586010 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-dpzwg"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.586154 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.589523 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.650931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config" (OuterVolumeSpecName: "config") pod "1c316e7d-6e65-42c3-b67d-93fc3027979d" (UID: "1c316e7d-6e65-42c3-b67d-93fc3027979d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.651117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c316e7d-6e65-42c3-b67d-93fc3027979d" (UID: "1c316e7d-6e65-42c3-b67d-93fc3027979d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.662672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-scripts\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.662902 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqtm\" (UniqueName: \"kubernetes.io/projected/c63c57da-3a5e-4144-a114-1089ea5f4ed6-kube-api-access-sqqtm\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.662947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c63c57da-3a5e-4144-a114-1089ea5f4ed6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.662991 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63c57da-3a5e-4144-a114-1089ea5f4ed6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.663028 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-config\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.663222 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5xg5\" (UniqueName: \"kubernetes.io/projected/1c316e7d-6e65-42c3-b67d-93fc3027979d-kube-api-access-s5xg5\") on node \"crc\" DevicePath \"\""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.663237 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-config\") on node \"crc\" DevicePath \"\""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.663246 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c316e7d-6e65-42c3-b67d-93fc3027979d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.764475 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqtm\" (UniqueName: \"kubernetes.io/projected/c63c57da-3a5e-4144-a114-1089ea5f4ed6-kube-api-access-sqqtm\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.764531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c63c57da-3a5e-4144-a114-1089ea5f4ed6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.764564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63c57da-3a5e-4144-a114-1089ea5f4ed6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.764591 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-config\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.764708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-scripts\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.765806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-scripts\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.766093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c63c57da-3a5e-4144-a114-1089ea5f4ed6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.766676 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c63c57da-3a5e-4144-a114-1089ea5f4ed6-config\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.768982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63c57da-3a5e-4144-a114-1089ea5f4ed6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.786106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqtm\" (UniqueName: \"kubernetes.io/projected/c63c57da-3a5e-4144-a114-1089ea5f4ed6-kube-api-access-sqqtm\") pod \"ovn-northd-0\" (UID: \"c63c57da-3a5e-4144-a114-1089ea5f4ed6\") " pod="openstack/ovn-northd-0"
Oct 06 16:17:48 crc kubenswrapper[4763]: I1006 16:17:48.953113 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.075775 4763 generic.go:334] "Generic (PLEG): container finished" podID="1c316e7d-6e65-42c3-b67d-93fc3027979d" containerID="46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af" exitCode=0
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.076127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" event={"ID":"1c316e7d-6e65-42c3-b67d-93fc3027979d","Type":"ContainerDied","Data":"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"}
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.076160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t" event={"ID":"1c316e7d-6e65-42c3-b67d-93fc3027979d","Type":"ContainerDied","Data":"a06c5feb4065a923a7995d9fb98189e037fa30423beac30bd820785fa5154a61"}
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.076181 4763 scope.go:117] "RemoveContainer" containerID="46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.076329 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjx7t"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.154936 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"]
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.160123 4763 scope.go:117] "RemoveContainer" containerID="4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.160230 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjx7t"]
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.204069 4763 scope.go:117] "RemoveContainer" containerID="46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"
Oct 06 16:17:49 crc kubenswrapper[4763]: E1006 16:17:49.204526 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af\": container with ID starting with 46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af not found: ID does not exist" containerID="46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.204599 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af"} err="failed to get container status \"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af\": rpc error: code = NotFound desc = could not find container \"46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af\": container with ID starting with 46f6e666746cac3a24da5be1936d8f454631f7e5b7daaeef5bc3395bfe4841af not found: ID does not exist"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.204666 4763 scope.go:117] "RemoveContainer" containerID="4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c"
Oct 06 16:17:49 crc kubenswrapper[4763]: E1006 16:17:49.205215 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c\": container with ID starting with 4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c not found: ID does not exist" containerID="4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.205243 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c"} err="failed to get container status \"4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c\": rpc error: code = NotFound desc = could not find container \"4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c\": container with ID starting with 4f37b85d3d0c7e769943792670c5c5141e88654e20cb64c394210d4437bc087c not found: ID does not exist"
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.442957 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 16:17:49 crc kubenswrapper[4763]: I1006 16:17:49.585038 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c316e7d-6e65-42c3-b67d-93fc3027979d" path="/var/lib/kubelet/pods/1c316e7d-6e65-42c3-b67d-93fc3027979d/volumes"
Oct 06 16:17:50 crc kubenswrapper[4763]: I1006 16:17:50.102016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c63c57da-3a5e-4144-a114-1089ea5f4ed6","Type":"ContainerStarted","Data":"7ba06d111b206917e2125f04e3f6afa34f1ad776427cc341002d0c9401166fa7"}
Oct 06 16:17:50 crc kubenswrapper[4763]: I1006 16:17:50.102098 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c63c57da-3a5e-4144-a114-1089ea5f4ed6","Type":"ContainerStarted","Data":"1b1fabaf0c28a477cc05a855fb747ba65341f6764e10da94b648839464e5f0dc"}
Oct 06 16:17:50 crc kubenswrapper[4763]: I1006 16:17:50.102123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c63c57da-3a5e-4144-a114-1089ea5f4ed6","Type":"ContainerStarted","Data":"708ef88df1a1d2c765f552c2b08d99dae269a279be472796f25d7e36d8e41a58"}
Oct 06 16:17:50 crc kubenswrapper[4763]: I1006 16:17:50.102146 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 06 16:17:50 crc kubenswrapper[4763]: I1006 16:17:50.128439 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.12841631 podStartE2EDuration="2.12841631s" podCreationTimestamp="2025-10-06 16:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:17:50.12542926 +0000 UTC m=+5067.280721782" watchObservedRunningTime="2025-10-06 16:17:50.12841631 +0000 UTC m=+5067.283708832"
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.316584 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wgs95"]
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.318129 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.326923 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wgs95"]
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.368472 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqbq\" (UniqueName: \"kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq\") pod \"keystone-db-create-wgs95\" (UID: \"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1\") " pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.470432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqbq\" (UniqueName: \"kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq\") pod \"keystone-db-create-wgs95\" (UID: \"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1\") " pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.499979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqbq\" (UniqueName: \"kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq\") pod \"keystone-db-create-wgs95\" (UID: \"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1\") " pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:54 crc kubenswrapper[4763]: I1006 16:17:54.647011 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:55 crc kubenswrapper[4763]: I1006 16:17:55.130371 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wgs95"]
Oct 06 16:17:55 crc kubenswrapper[4763]: I1006 16:17:55.159587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wgs95" event={"ID":"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1","Type":"ContainerStarted","Data":"b86887e26a9d50ab8bc7b1592a8e7f84a9bcd7cc7ca4fedb353211b0a573b41c"}
Oct 06 16:17:56 crc kubenswrapper[4763]: I1006 16:17:56.171924 4763 generic.go:334] "Generic (PLEG): container finished" podID="2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" containerID="4b122b9a4336f1356347c1bba23fa0c9a87c656067e34a512b71cafce3c44a63" exitCode=0
Oct 06 16:17:56 crc kubenswrapper[4763]: I1006 16:17:56.172033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wgs95" event={"ID":"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1","Type":"ContainerDied","Data":"4b122b9a4336f1356347c1bba23fa0c9a87c656067e34a512b71cafce3c44a63"}
Oct 06 16:17:57 crc kubenswrapper[4763]: I1006 16:17:57.509483 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wgs95"
Oct 06 16:17:57 crc kubenswrapper[4763]: I1006 16:17:57.535435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plqbq\" (UniqueName: \"kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq\") pod \"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1\" (UID: \"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1\") "
Oct 06 16:17:57 crc kubenswrapper[4763]: I1006 16:17:57.542787 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq" (OuterVolumeSpecName: "kube-api-access-plqbq") pod "2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" (UID: "2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1"). InnerVolumeSpecName "kube-api-access-plqbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:17:57 crc kubenswrapper[4763]: I1006 16:17:57.637459 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plqbq\" (UniqueName: \"kubernetes.io/projected/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1-kube-api-access-plqbq\") on node \"crc\" DevicePath \"\""
Oct 06 16:17:58 crc kubenswrapper[4763]: I1006 16:17:58.193290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wgs95" event={"ID":"2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1","Type":"ContainerDied","Data":"b86887e26a9d50ab8bc7b1592a8e7f84a9bcd7cc7ca4fedb353211b0a573b41c"}
Oct 06 16:17:58 crc kubenswrapper[4763]: I1006 16:17:58.193336 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b86887e26a9d50ab8bc7b1592a8e7f84a9bcd7cc7ca4fedb353211b0a573b41c"
Oct 06 16:17:58 crc kubenswrapper[4763]: I1006 16:17:58.193374 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wgs95"
Oct 06 16:18:03 crc kubenswrapper[4763]: I1006 16:18:03.877474 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 16:18:03 crc kubenswrapper[4763]: I1006 16:18:03.878060 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 16:18:03 crc kubenswrapper[4763]: I1006 16:18:03.878109 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw"
Oct 06 16:18:03 crc kubenswrapper[4763]: I1006 16:18:03.878892 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 16:18:03 crc kubenswrapper[4763]: I1006 16:18:03.878961 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42" gracePeriod=600
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.064641 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.272206 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42" exitCode=0
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.272250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42"}
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.272274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b"}
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.272287 4763 scope.go:117] "RemoveContainer" containerID="dc1dec291e0c5030c50ee112f9e78ee9133fa174670fd09a37c65f76261b0b0a"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.379961 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8e6c-account-create-x6w5b"]
Oct 06 16:18:04 crc kubenswrapper[4763]: E1006 16:18:04.380270 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" containerName="mariadb-database-create"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.380286 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" containerName="mariadb-database-create"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.380457 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" containerName="mariadb-database-create"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.380987 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.382648 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.388964 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e6c-account-create-x6w5b"]
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.465677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9lsk\" (UniqueName: \"kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk\") pod \"keystone-8e6c-account-create-x6w5b\" (UID: \"c90efc09-74b5-490d-b52c-a14d53f80fae\") " pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.566837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9lsk\" (UniqueName: \"kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk\") pod \"keystone-8e6c-account-create-x6w5b\" (UID: \"c90efc09-74b5-490d-b52c-a14d53f80fae\") " pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.595535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9lsk\" (UniqueName: \"kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk\") pod \"keystone-8e6c-account-create-x6w5b\" (UID: \"c90efc09-74b5-490d-b52c-a14d53f80fae\") " pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:04 crc kubenswrapper[4763]: I1006 16:18:04.735067 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:05 crc kubenswrapper[4763]: I1006 16:18:05.168938 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e6c-account-create-x6w5b"]
Oct 06 16:18:05 crc kubenswrapper[4763]: W1006 16:18:05.176709 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90efc09_74b5_490d_b52c_a14d53f80fae.slice/crio-c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f WatchSource:0}: Error finding container c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f: Status 404 returned error can't find the container with id c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f
Oct 06 16:18:05 crc kubenswrapper[4763]: I1006 16:18:05.283480 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e6c-account-create-x6w5b" event={"ID":"c90efc09-74b5-490d-b52c-a14d53f80fae","Type":"ContainerStarted","Data":"c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f"}
Oct 06 16:18:06 crc kubenswrapper[4763]: I1006 16:18:06.299132 4763 generic.go:334] "Generic (PLEG): container finished" podID="c90efc09-74b5-490d-b52c-a14d53f80fae" containerID="95836c7d627374295ea72b01fc8df22b5af9de1f2d87cb7f1a4f679b13dd1e82" exitCode=0
Oct 06 16:18:06 crc kubenswrapper[4763]: I1006 16:18:06.299259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e6c-account-create-x6w5b" event={"ID":"c90efc09-74b5-490d-b52c-a14d53f80fae","Type":"ContainerDied","Data":"95836c7d627374295ea72b01fc8df22b5af9de1f2d87cb7f1a4f679b13dd1e82"}
Oct 06 16:18:07 crc kubenswrapper[4763]: I1006 16:18:07.692111 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:07 crc kubenswrapper[4763]: I1006 16:18:07.821557 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9lsk\" (UniqueName: \"kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk\") pod \"c90efc09-74b5-490d-b52c-a14d53f80fae\" (UID: \"c90efc09-74b5-490d-b52c-a14d53f80fae\") "
Oct 06 16:18:07 crc kubenswrapper[4763]: I1006 16:18:07.828009 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk" (OuterVolumeSpecName: "kube-api-access-p9lsk") pod "c90efc09-74b5-490d-b52c-a14d53f80fae" (UID: "c90efc09-74b5-490d-b52c-a14d53f80fae"). InnerVolumeSpecName "kube-api-access-p9lsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:18:07 crc kubenswrapper[4763]: I1006 16:18:07.923743 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9lsk\" (UniqueName: \"kubernetes.io/projected/c90efc09-74b5-490d-b52c-a14d53f80fae-kube-api-access-p9lsk\") on node \"crc\" DevicePath \"\""
Oct 06 16:18:08 crc kubenswrapper[4763]: I1006 16:18:08.318612 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e6c-account-create-x6w5b" event={"ID":"c90efc09-74b5-490d-b52c-a14d53f80fae","Type":"ContainerDied","Data":"c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f"}
Oct 06 16:18:08 crc kubenswrapper[4763]: I1006 16:18:08.318958 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d475e10574fd9f790ab00fbaf4c312c358443e442889dc2cfee60c5958299f"
Oct 06 16:18:08 crc kubenswrapper[4763]: I1006 16:18:08.318721 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e6c-account-create-x6w5b"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.958825 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-s7wn2"]
Oct 06 16:18:09 crc kubenswrapper[4763]: E1006 16:18:09.959174 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90efc09-74b5-490d-b52c-a14d53f80fae" containerName="mariadb-account-create"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.959188 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90efc09-74b5-490d-b52c-a14d53f80fae" containerName="mariadb-account-create"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.959364 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90efc09-74b5-490d-b52c-a14d53f80fae" containerName="mariadb-account-create"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.959972 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.961696 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.961880 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.962929 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m25w5"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.963176 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 06 16:18:09 crc kubenswrapper[4763]: I1006 16:18:09.973952 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s7wn2"]
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.061401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.061680 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.061849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp5qg\" (UniqueName: \"kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.164033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.164118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.164167 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp5qg\" (UniqueName: \"kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.171341 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.173307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.187496 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp5qg\" (UniqueName: \"kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg\") pod \"keystone-db-sync-s7wn2\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.306847 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s7wn2"
Oct 06 16:18:10 crc kubenswrapper[4763]: I1006 16:18:10.758740 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s7wn2"]
Oct 06 16:18:10 crc kubenswrapper[4763]: W1006 16:18:10.767894 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bd4233_a8c5_48aa_b586_28cbf7c52d06.slice/crio-102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875 WatchSource:0}: Error finding container 102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875: Status 404 returned error can't find the container with id 102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875
Oct 06 16:18:11 crc kubenswrapper[4763]: I1006 16:18:11.348388 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s7wn2" event={"ID":"09bd4233-a8c5-48aa-b586-28cbf7c52d06","Type":"ContainerStarted","Data":"3ab2ce31a2ecb94217cb9981df634b19563bea49ecb03e202dbbcd1f98ce06ce"}
Oct 06 16:18:11 crc kubenswrapper[4763]: I1006 16:18:11.348481 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s7wn2" event={"ID":"09bd4233-a8c5-48aa-b586-28cbf7c52d06","Type":"ContainerStarted","Data":"102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875"}
Oct 06 16:18:11 crc kubenswrapper[4763]: I1006 16:18:11.370115 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-s7wn2" podStartSLOduration=2.370088169 podStartE2EDuration="2.370088169s" podCreationTimestamp="2025-10-06 16:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:18:11.363079121 +0000 UTC m=+5088.518371623" watchObservedRunningTime="2025-10-06 16:18:11.370088169 +0000 UTC m=+5088.525380721"
Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.353058 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"]
Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.363880 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.364457 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"] Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.390504 4763 generic.go:334] "Generic (PLEG): container finished" podID="09bd4233-a8c5-48aa-b586-28cbf7c52d06" containerID="3ab2ce31a2ecb94217cb9981df634b19563bea49ecb03e202dbbcd1f98ce06ce" exitCode=0 Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.390567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s7wn2" event={"ID":"09bd4233-a8c5-48aa-b586-28cbf7c52d06","Type":"ContainerDied","Data":"3ab2ce31a2ecb94217cb9981df634b19563bea49ecb03e202dbbcd1f98ce06ce"} Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.522453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.522672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4xb\" (UniqueName: \"kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.522936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.624036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.624146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4xb\" (UniqueName: \"kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.624226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.624454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content\") pod \"redhat-marketplace-2rqfm\" (UID: 
\"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.624570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.651823 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4xb\" (UniqueName: \"kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb\") pod \"redhat-marketplace-2rqfm\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:13 crc kubenswrapper[4763]: I1006 16:18:13.696199 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.258912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"] Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.400291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerStarted","Data":"dce51647f9f82e67f3137436b365ea7c4081279683d9393b917bd6c55288ac44"} Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.757964 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s7wn2" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.849537 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle\") pod \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.849654 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data\") pod \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.849897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp5qg\" (UniqueName: \"kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg\") pod \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\" (UID: \"09bd4233-a8c5-48aa-b586-28cbf7c52d06\") " Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.857209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg" (OuterVolumeSpecName: "kube-api-access-fp5qg") pod "09bd4233-a8c5-48aa-b586-28cbf7c52d06" (UID: "09bd4233-a8c5-48aa-b586-28cbf7c52d06"). InnerVolumeSpecName "kube-api-access-fp5qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.905810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09bd4233-a8c5-48aa-b586-28cbf7c52d06" (UID: "09bd4233-a8c5-48aa-b586-28cbf7c52d06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.918528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data" (OuterVolumeSpecName: "config-data") pod "09bd4233-a8c5-48aa-b586-28cbf7c52d06" (UID: "09bd4233-a8c5-48aa-b586-28cbf7c52d06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.952046 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp5qg\" (UniqueName: \"kubernetes.io/projected/09bd4233-a8c5-48aa-b586-28cbf7c52d06-kube-api-access-fp5qg\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.952094 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:14 crc kubenswrapper[4763]: I1006 16:18:14.952114 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bd4233-a8c5-48aa-b586-28cbf7c52d06-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.418815 4763 generic.go:334] "Generic (PLEG): container finished" podID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerID="e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886" exitCode=0 Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.418896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerDied","Data":"e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886"} Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.421025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s7wn2" event={"ID":"09bd4233-a8c5-48aa-b586-28cbf7c52d06","Type":"ContainerDied","Data":"102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875"} Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.421052 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102cf655b95218e924c047644d3ebb9772091897253ffe5697917a145773a875" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.421082 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-s7wn2" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.663430 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:18:15 crc kubenswrapper[4763]: E1006 16:18:15.664190 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bd4233-a8c5-48aa-b586-28cbf7c52d06" containerName="keystone-db-sync" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.664210 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bd4233-a8c5-48aa-b586-28cbf7c52d06" containerName="keystone-db-sync" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.664438 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="09bd4233-a8c5-48aa-b586-28cbf7c52d06" containerName="keystone-db-sync" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.665461 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.692547 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.716685 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xqr56"] Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.717759 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.722403 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.722450 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m25w5" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.722404 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.722786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.727050 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqr56"] Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.764500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.764577 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.764680 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 
16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.764789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmv4b\" (UniqueName: \"kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.764833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.866823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtqh7\" (UniqueName: \"kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.866931 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.866975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " 
pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmv4b\" (UniqueName: \"kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.867349 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.868057 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.868801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.868866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.869514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.890234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmv4b\" (UniqueName: \"kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b\") pod \"dnsmasq-dns-7485969d9c-vc6dc\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968452 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtqh7\" (UniqueName: \"kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.968669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.971658 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.971902 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.972157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.973009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys\") pod 
\"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.977683 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.986049 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:15 crc kubenswrapper[4763]: I1006 16:18:15.993255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtqh7\" (UniqueName: \"kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7\") pod \"keystone-bootstrap-xqr56\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:16 crc kubenswrapper[4763]: I1006 16:18:16.051004 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:16 crc kubenswrapper[4763]: I1006 16:18:16.430163 4763 generic.go:334] "Generic (PLEG): container finished" podID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerID="b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df" exitCode=0 Oct 06 16:18:16 crc kubenswrapper[4763]: I1006 16:18:16.430410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerDied","Data":"b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df"} Oct 06 16:18:16 crc kubenswrapper[4763]: I1006 16:18:16.441424 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:18:16 crc kubenswrapper[4763]: I1006 16:18:16.534084 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqr56"] Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.440394 4763 generic.go:334] "Generic (PLEG): container finished" podID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerID="f3a66e8def67ce19bd7428ccd312d6e003ed9a7f9ae0e85fc58c0ea42e6b2bcb" exitCode=0 Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.440437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" event={"ID":"abbbc32d-99b3-4a65-951b-11aa0daaefd5","Type":"ContainerDied","Data":"f3a66e8def67ce19bd7428ccd312d6e003ed9a7f9ae0e85fc58c0ea42e6b2bcb"} Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.441525 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" event={"ID":"abbbc32d-99b3-4a65-951b-11aa0daaefd5","Type":"ContainerStarted","Data":"a1b390e8850dd91cc3d2cc3754fc9ec10b4e7951f2a3e7b0f97cf6294fce6955"} Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.444318 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqr56" event={"ID":"bfad278b-9ccd-4d8e-973c-b173a79dc47a","Type":"ContainerStarted","Data":"24b9a114c51fa39b3cbd76bddf067565d33db89070eff3940cf846b080a10460"} Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.444432 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqr56" 
event={"ID":"bfad278b-9ccd-4d8e-973c-b173a79dc47a","Type":"ContainerStarted","Data":"148c093ac6c91316c826db8f0045a180c4a1bb9b91c44329f20db663322e1c1b"} Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.457228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerStarted","Data":"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843"} Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.487147 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2rqfm" podStartSLOduration=2.7363260719999998 podStartE2EDuration="4.48710852s" podCreationTimestamp="2025-10-06 16:18:13 +0000 UTC" firstStartedPulling="2025-10-06 16:18:15.421720983 +0000 UTC m=+5092.577013525" lastFinishedPulling="2025-10-06 16:18:17.172503451 +0000 UTC m=+5094.327795973" observedRunningTime="2025-10-06 16:18:17.485053865 +0000 UTC m=+5094.640346387" watchObservedRunningTime="2025-10-06 16:18:17.48710852 +0000 UTC m=+5094.642401032" Oct 06 16:18:17 crc kubenswrapper[4763]: I1006 16:18:17.510344 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xqr56" podStartSLOduration=2.510326792 podStartE2EDuration="2.510326792s" podCreationTimestamp="2025-10-06 16:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:18:17.505222346 +0000 UTC m=+5094.660514858" watchObservedRunningTime="2025-10-06 16:18:17.510326792 +0000 UTC m=+5094.665619304" Oct 06 16:18:18 crc kubenswrapper[4763]: I1006 16:18:18.474445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" event={"ID":"abbbc32d-99b3-4a65-951b-11aa0daaefd5","Type":"ContainerStarted","Data":"82cd5a691147581f75d57d268c2b94f08ff9258b116c1085e83508ddfac252a3"} Oct 06 16:18:19 crc kubenswrapper[4763]: I1006 16:18:19.483922 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:20 crc kubenswrapper[4763]: I1006 16:18:20.497491 4763 generic.go:334] "Generic (PLEG): container finished" podID="bfad278b-9ccd-4d8e-973c-b173a79dc47a" containerID="24b9a114c51fa39b3cbd76bddf067565d33db89070eff3940cf846b080a10460" exitCode=0 Oct 06 16:18:20 crc kubenswrapper[4763]: I1006 16:18:20.497661 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqr56" event={"ID":"bfad278b-9ccd-4d8e-973c-b173a79dc47a","Type":"ContainerDied","Data":"24b9a114c51fa39b3cbd76bddf067565d33db89070eff3940cf846b080a10460"} Oct 06 16:18:20 crc kubenswrapper[4763]: I1006 16:18:20.524973 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" podStartSLOduration=5.524948842 podStartE2EDuration="5.524948842s" podCreationTimestamp="2025-10-06 16:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:18:18.501492469 +0000 UTC m=+5095.656785011" watchObservedRunningTime="2025-10-06 16:18:20.524948842 +0000 UTC m=+5097.680241384" Oct 06 16:18:21 crc kubenswrapper[4763]: I1006 16:18:21.901506 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073259 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073399 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073458 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtqh7\" (UniqueName: \"kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.073595 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys\") pod \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\" (UID: \"bfad278b-9ccd-4d8e-973c-b173a79dc47a\") " Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.079986 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.080032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7" (OuterVolumeSpecName: "kube-api-access-dtqh7") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "kube-api-access-dtqh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.080151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.103248 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts" (OuterVolumeSpecName: "scripts") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.110018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.133188 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data" (OuterVolumeSpecName: "config-data") pod "bfad278b-9ccd-4d8e-973c-b173a79dc47a" (UID: "bfad278b-9ccd-4d8e-973c-b173a79dc47a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176072 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176127 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176148 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176172 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtqh7\" (UniqueName: \"kubernetes.io/projected/bfad278b-9ccd-4d8e-973c-b173a79dc47a-kube-api-access-dtqh7\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176191 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.176207 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfad278b-9ccd-4d8e-973c-b173a79dc47a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.519065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqr56" event={"ID":"bfad278b-9ccd-4d8e-973c-b173a79dc47a","Type":"ContainerDied","Data":"148c093ac6c91316c826db8f0045a180c4a1bb9b91c44329f20db663322e1c1b"} Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.519111 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="148c093ac6c91316c826db8f0045a180c4a1bb9b91c44329f20db663322e1c1b" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.519165 4763 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqr56" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.600245 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xqr56"] Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.608065 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xqr56"] Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.682961 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-znxl7"] Oct 06 16:18:22 crc kubenswrapper[4763]: E1006 16:18:22.683278 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad278b-9ccd-4d8e-973c-b173a79dc47a" containerName="keystone-bootstrap" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.683296 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad278b-9ccd-4d8e-973c-b173a79dc47a" containerName="keystone-bootstrap" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.683480 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfad278b-9ccd-4d8e-973c-b173a79dc47a" containerName="keystone-bootstrap" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.685517 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.687515 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.691541 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m25w5" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.692325 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.692908 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.696412 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-znxl7"] Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.793519 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bmln\" (UniqueName: \"kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.793676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.793807 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.793853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.794040 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.794215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.896211 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bmln\" (UniqueName: \"kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.896605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.896768 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.896910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.897024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.897161 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.902008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys\") pod 
\"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.902064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.902389 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.902697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.902991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:22 crc kubenswrapper[4763]: I1006 16:18:22.914654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bmln\" (UniqueName: \"kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln\") pod \"keystone-bootstrap-znxl7\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.061054 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.606717 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfad278b-9ccd-4d8e-973c-b173a79dc47a" path="/var/lib/kubelet/pods/bfad278b-9ccd-4d8e-973c-b173a79dc47a/volumes" Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.654503 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-znxl7"] Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.701958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.702008 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:23 crc kubenswrapper[4763]: I1006 16:18:23.768607 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:24 crc kubenswrapper[4763]: I1006 16:18:24.538813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znxl7" event={"ID":"c967d347-b476-40be-8c24-7088d6ee6630","Type":"ContainerStarted","Data":"749c7a68ebd2e3de68a7ce7596666947ddc5449c59454de76585158b70182d6f"} Oct 06 16:18:24 crc kubenswrapper[4763]: I1006 16:18:24.538853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znxl7" event={"ID":"c967d347-b476-40be-8c24-7088d6ee6630","Type":"ContainerStarted","Data":"16ed422e6bee471d74c601820c71190147267392ed4f328750daba1ba44a1cc5"} Oct 06 16:18:24 crc kubenswrapper[4763]: I1006 16:18:24.560744 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-znxl7" podStartSLOduration=2.560723682 podStartE2EDuration="2.560723682s" podCreationTimestamp="2025-10-06 16:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:18:24.559099678 +0000 UTC m=+5101.714392220" watchObservedRunningTime="2025-10-06 16:18:24.560723682 +0000 UTC m=+5101.716016204" Oct 06 16:18:24 crc kubenswrapper[4763]: I1006 16:18:24.611872 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:24 crc kubenswrapper[4763]: I1006 16:18:24.664929 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"] Oct 06 16:18:25 crc kubenswrapper[4763]: I1006 16:18:25.987931 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.098325 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"] Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.098525 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="dnsmasq-dns" containerID="cri-o://ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885" gracePeriod=10 Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.553907 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.553980 4763 generic.go:334] "Generic (PLEG): container finished" podID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerID="ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885" exitCode=0 Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.554003 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" event={"ID":"0a2b2192-1baf-4aef-929c-908e4b0afd12","Type":"ContainerDied","Data":"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885"} Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.554505 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" event={"ID":"0a2b2192-1baf-4aef-929c-908e4b0afd12","Type":"ContainerDied","Data":"905d78a514cc5d3f81b322b789c1b3385a07035714e56b85710b75c0d9f11095"} Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.554538 4763 scope.go:117] "RemoveContainer" containerID="ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.556938 4763 generic.go:334] "Generic (PLEG): container finished" podID="c967d347-b476-40be-8c24-7088d6ee6630" containerID="749c7a68ebd2e3de68a7ce7596666947ddc5449c59454de76585158b70182d6f" exitCode=0 Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.557034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znxl7" event={"ID":"c967d347-b476-40be-8c24-7088d6ee6630","Type":"ContainerDied","Data":"749c7a68ebd2e3de68a7ce7596666947ddc5449c59454de76585158b70182d6f"} Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.557188 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2rqfm" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="registry-server" containerID="cri-o://a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843" gracePeriod=2 Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.589207 4763 scope.go:117] "RemoveContainer" containerID="c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.657963 4763 scope.go:117] "RemoveContainer" containerID="ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885" Oct 06 16:18:26 crc kubenswrapper[4763]: E1006 16:18:26.658445 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885\": container with ID starting with ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885 not found: ID does not exist" containerID="ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.658516 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885"} err="failed to get container status \"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885\": rpc error: code = NotFound desc = could not find container \"ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885\": container with ID starting with ede5b45c5e72903d310a8c0e87f1492a5a4b304e0278462ac7a25245fa475885 not found: ID does not exist" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.658554 4763 
scope.go:117] "RemoveContainer" containerID="c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad" Oct 06 16:18:26 crc kubenswrapper[4763]: E1006 16:18:26.658995 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad\": container with ID starting with c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad not found: ID does not exist" containerID="c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.659032 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad"} err="failed to get container status \"c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad\": rpc error: code = NotFound desc = could not find container \"c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad\": container with ID starting with c88dd771503594831cbcc50fe72c22826f40842fc0df2f543650bd140a6d16ad not found: ID does not exist" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.702074 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config\") pod \"0a2b2192-1baf-4aef-929c-908e4b0afd12\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.702203 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb\") pod \"0a2b2192-1baf-4aef-929c-908e4b0afd12\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.702239 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb\") pod \"0a2b2192-1baf-4aef-929c-908e4b0afd12\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.702322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbpgv\" (UniqueName: \"kubernetes.io/projected/0a2b2192-1baf-4aef-929c-908e4b0afd12-kube-api-access-bbpgv\") pod \"0a2b2192-1baf-4aef-929c-908e4b0afd12\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.702404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc\") pod \"0a2b2192-1baf-4aef-929c-908e4b0afd12\" (UID: \"0a2b2192-1baf-4aef-929c-908e4b0afd12\") " Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.709436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2b2192-1baf-4aef-929c-908e4b0afd12-kube-api-access-bbpgv" (OuterVolumeSpecName: "kube-api-access-bbpgv") pod "0a2b2192-1baf-4aef-929c-908e4b0afd12" (UID: "0a2b2192-1baf-4aef-929c-908e4b0afd12"). InnerVolumeSpecName "kube-api-access-bbpgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.748167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a2b2192-1baf-4aef-929c-908e4b0afd12" (UID: "0a2b2192-1baf-4aef-929c-908e4b0afd12"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.755998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a2b2192-1baf-4aef-929c-908e4b0afd12" (UID: "0a2b2192-1baf-4aef-929c-908e4b0afd12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.761701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config" (OuterVolumeSpecName: "config") pod "0a2b2192-1baf-4aef-929c-908e4b0afd12" (UID: "0a2b2192-1baf-4aef-929c-908e4b0afd12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.782047 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a2b2192-1baf-4aef-929c-908e4b0afd12" (UID: "0a2b2192-1baf-4aef-929c-908e4b0afd12"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.804118 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.804156 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.804168 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbpgv\" (UniqueName: \"kubernetes.io/projected/0a2b2192-1baf-4aef-929c-908e4b0afd12-kube-api-access-bbpgv\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.804183 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.804193 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2b2192-1baf-4aef-929c-908e4b0afd12-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:26 crc kubenswrapper[4763]: I1006 16:18:26.904259 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.005534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities\") pod \"29bbf759-a445-4c0f-b654-bc1be03f8d94\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.005574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz4xb\" (UniqueName: \"kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb\") pod \"29bbf759-a445-4c0f-b654-bc1be03f8d94\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.005701 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content\") pod \"29bbf759-a445-4c0f-b654-bc1be03f8d94\" (UID: \"29bbf759-a445-4c0f-b654-bc1be03f8d94\") " Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.006664 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities" (OuterVolumeSpecName: "utilities") pod "29bbf759-a445-4c0f-b654-bc1be03f8d94" (UID: "29bbf759-a445-4c0f-b654-bc1be03f8d94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.008891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb" (OuterVolumeSpecName: "kube-api-access-tz4xb") pod "29bbf759-a445-4c0f-b654-bc1be03f8d94" (UID: "29bbf759-a445-4c0f-b654-bc1be03f8d94"). InnerVolumeSpecName "kube-api-access-tz4xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.023466 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29bbf759-a445-4c0f-b654-bc1be03f8d94" (UID: "29bbf759-a445-4c0f-b654-bc1be03f8d94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.108463 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.108518 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz4xb\" (UniqueName: \"kubernetes.io/projected/29bbf759-a445-4c0f-b654-bc1be03f8d94-kube-api-access-tz4xb\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.108540 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29bbf759-a445-4c0f-b654-bc1be03f8d94-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.571536 4763 generic.go:334] "Generic (PLEG): container finished" podID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerID="a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843" exitCode=0 Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.571691 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rqfm" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.571689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerDied","Data":"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843"} Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.572252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rqfm" event={"ID":"29bbf759-a445-4c0f-b654-bc1be03f8d94","Type":"ContainerDied","Data":"dce51647f9f82e67f3137436b365ea7c4081279683d9393b917bd6c55288ac44"} Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.572287 4763 scope.go:117] "RemoveContainer" containerID="a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.574917 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-2fvtj" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.625979 4763 scope.go:117] "RemoveContainer" containerID="b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.652209 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"] Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.660396 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rqfm"] Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.669534 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"] Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.672034 4763 scope.go:117] "RemoveContainer" containerID="e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.675972 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-2fvtj"] Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.744481 4763 scope.go:117] "RemoveContainer" containerID="a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843" Oct 06 16:18:27 crc kubenswrapper[4763]: E1006 16:18:27.745129 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843\": container with ID starting with a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843 not found: ID does not exist" containerID="a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.745360 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843"} err="failed to get container status \"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843\": rpc error: code = NotFound desc = could not find container \"a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843\": container with ID starting with a182aa00ddd021a827d8fae6850fc08a510454d178c38b7f0fca9e3093db7843 not found: ID does not exist" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.745522 4763 scope.go:117] "RemoveContainer" containerID="b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df" Oct 06 16:18:27 crc kubenswrapper[4763]: E1006 16:18:27.746050 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df\": container with ID starting with b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df not found: ID does not exist" containerID="b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.746096 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df"} err="failed to get container status \"b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df\": rpc error: code = NotFound desc = could not find container \"b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df\": container with ID starting with b05eee7d20e3c0959898d7c49a9dcd79faaa9571bd66ece31ec317c7fcc8d9df not found: 
ID does not exist" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.746120 4763 scope.go:117] "RemoveContainer" containerID="e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886" Oct 06 16:18:27 crc kubenswrapper[4763]: E1006 16:18:27.747523 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886\": container with ID starting with e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886 not found: ID does not exist" containerID="e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.747560 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886"} err="failed to get container status \"e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886\": rpc error: code = NotFound desc = could not find container \"e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886\": container with ID starting with e4258a520e790a7869085cdcf1739f64ffc4acc03163aea5c86c169e6e4b0886 not found: ID does not exist" Oct 06 16:18:27 crc kubenswrapper[4763]: I1006 16:18:27.992807 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.128714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bmln\" (UniqueName: \"kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.128763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.128801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.128823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.128873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.129019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle\") pod \"c967d347-b476-40be-8c24-7088d6ee6630\" (UID: \"c967d347-b476-40be-8c24-7088d6ee6630\") " Oct 
06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.140828 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts" (OuterVolumeSpecName: "scripts") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.141242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln" (OuterVolumeSpecName: "kube-api-access-7bmln") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "kube-api-access-7bmln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.142804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.142904 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.176783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.183800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data" (OuterVolumeSpecName: "config-data") pod "c967d347-b476-40be-8c24-7088d6ee6630" (UID: "c967d347-b476-40be-8c24-7088d6ee6630"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230764 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230796 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bmln\" (UniqueName: \"kubernetes.io/projected/c967d347-b476-40be-8c24-7088d6ee6630-kube-api-access-7bmln\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230811 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230823 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230835 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.230845 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c967d347-b476-40be-8c24-7088d6ee6630-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.585877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znxl7" event={"ID":"c967d347-b476-40be-8c24-7088d6ee6630","Type":"ContainerDied","Data":"16ed422e6bee471d74c601820c71190147267392ed4f328750daba1ba44a1cc5"} Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.585955 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16ed422e6bee471d74c601820c71190147267392ed4f328750daba1ba44a1cc5" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.585982 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-znxl7" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695202 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b9d95cb55-7cm7p"] Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695535 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c967d347-b476-40be-8c24-7088d6ee6630" containerName="keystone-bootstrap" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695547 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c967d347-b476-40be-8c24-7088d6ee6630" containerName="keystone-bootstrap" Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695560 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="registry-server" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695566 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="registry-server" Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695586 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="dnsmasq-dns" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695594 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="dnsmasq-dns" Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695604 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="extract-utilities" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695640 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="extract-utilities" Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695648 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="init" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695653 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="init" Oct 06 16:18:28 crc kubenswrapper[4763]: E1006 16:18:28.695670 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="extract-content" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695676 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="extract-content" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695835 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" containerName="registry-server" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695851 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c967d347-b476-40be-8c24-7088d6ee6630" containerName="keystone-bootstrap" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.695864 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" containerName="dnsmasq-dns" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.696402 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.698420 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.698764 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.698811 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m25w5" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.698852 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.714362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9d95cb55-7cm7p"] Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.841431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrsz9\" (UniqueName: \"kubernetes.io/projected/d31fbaf3-0f27-4eec-904f-d5d23184a12d-kube-api-access-jrsz9\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.841534 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-credential-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.841714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-combined-ca-bundle\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.841742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-config-data\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.841767 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-fernet-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.842161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-scripts\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.943911 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-combined-ca-bundle\") pod 
\"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.943967 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-config-data\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.943997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-fernet-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.944022 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-scripts\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.944048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrsz9\" (UniqueName: \"kubernetes.io/projected/d31fbaf3-0f27-4eec-904f-d5d23184a12d-kube-api-access-jrsz9\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.944108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-credential-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.947907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-scripts\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.947946 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-credential-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.949165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-fernet-keys\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.949547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-config-data\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.949565 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fbaf3-0f27-4eec-904f-d5d23184a12d-combined-ca-bundle\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:28 crc kubenswrapper[4763]: I1006 16:18:28.962760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrsz9\" (UniqueName: \"kubernetes.io/projected/d31fbaf3-0f27-4eec-904f-d5d23184a12d-kube-api-access-jrsz9\") pod \"keystone-b9d95cb55-7cm7p\" (UID: \"d31fbaf3-0f27-4eec-904f-d5d23184a12d\") " pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.023863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.289982 4763 scope.go:117] "RemoveContainer" containerID="9da989787b4d480bbae672c943b007c6a7488de37cac898f17a23a1e8beacb87" Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.306059 4763 scope.go:117] "RemoveContainer" containerID="e891ede40f6595ded8bfd1aec0bf740feb8834f22c0681135b7c2a8120eecbfc" Oct 06 16:18:29 crc kubenswrapper[4763]: W1006 16:18:29.481308 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd31fbaf3_0f27_4eec_904f_d5d23184a12d.slice/crio-cedd7c342a1a51509f035d203762d1e78b31cec4ec936b714b6c75d71fa3852d WatchSource:0}: Error finding container cedd7c342a1a51509f035d203762d1e78b31cec4ec936b714b6c75d71fa3852d: Status 404 returned error can't find the container with id cedd7c342a1a51509f035d203762d1e78b31cec4ec936b714b6c75d71fa3852d Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.482356 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9d95cb55-7cm7p"] Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.588463 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2b2192-1baf-4aef-929c-908e4b0afd12" path="/var/lib/kubelet/pods/0a2b2192-1baf-4aef-929c-908e4b0afd12/volumes" Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.589172 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bbf759-a445-4c0f-b654-bc1be03f8d94" path="/var/lib/kubelet/pods/29bbf759-a445-4c0f-b654-bc1be03f8d94/volumes" Oct 06 16:18:29 crc kubenswrapper[4763]: I1006 16:18:29.594223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9d95cb55-7cm7p" event={"ID":"d31fbaf3-0f27-4eec-904f-d5d23184a12d","Type":"ContainerStarted","Data":"cedd7c342a1a51509f035d203762d1e78b31cec4ec936b714b6c75d71fa3852d"} Oct 06 16:18:30 crc kubenswrapper[4763]: I1006 16:18:30.608803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9d95cb55-7cm7p" event={"ID":"d31fbaf3-0f27-4eec-904f-d5d23184a12d","Type":"ContainerStarted","Data":"088b9a277391fd8291350227ae6153691d54950b2fa88260b30aed0b26565118"} Oct 06 16:18:30 crc kubenswrapper[4763]: I1006 16:18:30.609488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:18:30 crc kubenswrapper[4763]: I1006 16:18:30.643420 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b9d95cb55-7cm7p" podStartSLOduration=2.6434028229999997 podStartE2EDuration="2.643402823s" podCreationTimestamp="2025-10-06 16:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:18:30.625406911 +0000 UTC m=+5107.780699433" watchObservedRunningTime="2025-10-06 16:18:30.643402823 +0000 UTC m=+5107.798695335" Oct 06 16:19:00 crc kubenswrapper[4763]: I1006 16:19:00.503524 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b9d95cb55-7cm7p" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.465182 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.466682 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.470067 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cjdh5" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.470114 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.471546 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.488556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.551764 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: E1006 16:19:04.554365 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2p4sd openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="537678d2-6b4a-4485-a6b4-3afc1f5ee527" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.557896 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.594744 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.596088 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.605734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.609705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.609867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.610004 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4sd\" (UniqueName: \"kubernetes.io/projected/537678d2-6b4a-4485-a6b4-3afc1f5ee527-kube-api-access-2p4sd\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.710992 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.711277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.711350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.711377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p4sd\" (UniqueName: \"kubernetes.io/projected/537678d2-6b4a-4485-a6b4-3afc1f5ee527-kube-api-access-2p4sd\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.711400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xld87\" (UniqueName: \"kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.711455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.712231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: E1006 16:19:04.715043 4763 projected.go:194] Error preparing data for projected volume kube-api-access-2p4sd for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (537678d2-6b4a-4485-a6b4-3afc1f5ee527) does not match the UID in record. The object might have been deleted and then recreated Oct 06 16:19:04 crc kubenswrapper[4763]: E1006 16:19:04.715102 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/537678d2-6b4a-4485-a6b4-3afc1f5ee527-kube-api-access-2p4sd podName:537678d2-6b4a-4485-a6b4-3afc1f5ee527 nodeName:}" failed. No retries permitted until 2025-10-06 16:19:05.215087144 +0000 UTC m=+5142.370379656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2p4sd" (UniqueName: "kubernetes.io/projected/537678d2-6b4a-4485-a6b4-3afc1f5ee527-kube-api-access-2p4sd") pod "openstackclient" (UID: "537678d2-6b4a-4485-a6b4-3afc1f5ee527") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (537678d2-6b4a-4485-a6b4-3afc1f5ee527) does not match the UID in record. 
The object might have been deleted and then recreated Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.730826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret\") pod \"openstackclient\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.813659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.813789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.813962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xld87\" (UniqueName: \"kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.818407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.828101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.833256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xld87\" (UniqueName: \"kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87\") pod \"openstackclient\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.915161 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.947989 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:04 crc kubenswrapper[4763]: I1006 16:19:04.952352 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="537678d2-6b4a-4485-a6b4-3afc1f5ee527" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.013990 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.119907 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret\") pod \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.120248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config\") pod \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\" (UID: \"537678d2-6b4a-4485-a6b4-3afc1f5ee527\") " Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.120956 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p4sd\" (UniqueName: \"kubernetes.io/projected/537678d2-6b4a-4485-a6b4-3afc1f5ee527-kube-api-access-2p4sd\") on node \"crc\" DevicePath \"\"" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.121684 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "537678d2-6b4a-4485-a6b4-3afc1f5ee527" (UID: "537678d2-6b4a-4485-a6b4-3afc1f5ee527"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.130378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "537678d2-6b4a-4485-a6b4-3afc1f5ee527" (UID: "537678d2-6b4a-4485-a6b4-3afc1f5ee527"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.222535 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.222566 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/537678d2-6b4a-4485-a6b4-3afc1f5ee527-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.370976 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.586563 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537678d2-6b4a-4485-a6b4-3afc1f5ee527" path="/var/lib/kubelet/pods/537678d2-6b4a-4485-a6b4-3afc1f5ee527/volumes" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.960512 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.960591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f6c8b569-3102-4145-94db-b8150854dbc9","Type":"ContainerStarted","Data":"a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad"} Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.960685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f6c8b569-3102-4145-94db-b8150854dbc9","Type":"ContainerStarted","Data":"be178ebc29c5dbf627942ffa4ec6113d857669ee544cddade4d60e5ae4663a61"} Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.985626 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="537678d2-6b4a-4485-a6b4-3afc1f5ee527" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" Oct 06 16:19:05 crc kubenswrapper[4763]: I1006 16:19:05.986791 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.986779906 podStartE2EDuration="1.986779906s" podCreationTimestamp="2025-10-06 16:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:19:05.982017708 +0000 UTC m=+5143.137310230" watchObservedRunningTime="2025-10-06 16:19:05.986779906 +0000 UTC m=+5143.142072418" Oct 06 16:19:29 crc kubenswrapper[4763]: I1006 16:19:29.418880 4763 scope.go:117] "RemoveContainer" containerID="ccd862a1b0624e0f932212c354f37fb049363a038d98682928750992554b2c70" Oct 06 16:19:29 crc kubenswrapper[4763]: I1006 16:19:29.447680 4763 scope.go:117] "RemoveContainer" containerID="48349448639c3511b91d16da3e001d9735b77c7e4ec6cb1251776ab018e38955" Oct 06 16:20:33 crc kubenswrapper[4763]: I1006 16:20:33.876318 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:20:33 crc kubenswrapper[4763]: I1006 16:20:33.876837 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:20:49 crc kubenswrapper[4763]: I1006 16:20:49.845245 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zqmcj"] Oct 06 16:20:49 crc kubenswrapper[4763]: I1006 16:20:49.848105 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:49 crc kubenswrapper[4763]: I1006 16:20:49.851837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zqmcj"] Oct 06 16:20:49 crc kubenswrapper[4763]: I1006 16:20:49.964020 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4fr\" (UniqueName: \"kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr\") pod \"barbican-db-create-zqmcj\" (UID: \"08632e8e-5a58-413d-a3ba-c9454181600d\") " pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:50 crc kubenswrapper[4763]: I1006 16:20:50.065860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4fr\" (UniqueName: \"kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr\") pod \"barbican-db-create-zqmcj\" (UID: \"08632e8e-5a58-413d-a3ba-c9454181600d\") " pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:50 crc kubenswrapper[4763]: I1006 16:20:50.100169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4fr\" (UniqueName: \"kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr\") pod \"barbican-db-create-zqmcj\" (UID: \"08632e8e-5a58-413d-a3ba-c9454181600d\") " pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:50 crc kubenswrapper[4763]: I1006 16:20:50.169474 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:50 crc kubenswrapper[4763]: I1006 16:20:50.694223 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zqmcj"] Oct 06 16:20:50 crc kubenswrapper[4763]: W1006 16:20:50.699103 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08632e8e_5a58_413d_a3ba_c9454181600d.slice/crio-e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243 WatchSource:0}: Error finding container e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243: Status 404 returned error can't find the container with id e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243 Oct 06 16:20:51 crc kubenswrapper[4763]: I1006 16:20:51.000503 4763 generic.go:334] "Generic (PLEG): container finished" podID="08632e8e-5a58-413d-a3ba-c9454181600d" containerID="a133221f5356490e85eccdbf7b8fa1e87c7929a9574b1f6e919ee4a000439dcc" exitCode=0 Oct 06 16:20:51 crc kubenswrapper[4763]: I1006 16:20:51.000579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqmcj" event={"ID":"08632e8e-5a58-413d-a3ba-c9454181600d","Type":"ContainerDied","Data":"a133221f5356490e85eccdbf7b8fa1e87c7929a9574b1f6e919ee4a000439dcc"} Oct 06 16:20:51 crc kubenswrapper[4763]: I1006 16:20:51.000707 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqmcj" event={"ID":"08632e8e-5a58-413d-a3ba-c9454181600d","Type":"ContainerStarted","Data":"e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243"} Oct 06 16:20:52 crc kubenswrapper[4763]: I1006 16:20:52.326612 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:52 crc kubenswrapper[4763]: I1006 16:20:52.514340 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c4fr\" (UniqueName: \"kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr\") pod \"08632e8e-5a58-413d-a3ba-c9454181600d\" (UID: \"08632e8e-5a58-413d-a3ba-c9454181600d\") " Oct 06 16:20:52 crc kubenswrapper[4763]: I1006 16:20:52.524079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr" (OuterVolumeSpecName: "kube-api-access-5c4fr") pod "08632e8e-5a58-413d-a3ba-c9454181600d" (UID: "08632e8e-5a58-413d-a3ba-c9454181600d"). InnerVolumeSpecName "kube-api-access-5c4fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:20:52 crc kubenswrapper[4763]: I1006 16:20:52.616787 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c4fr\" (UniqueName: \"kubernetes.io/projected/08632e8e-5a58-413d-a3ba-c9454181600d-kube-api-access-5c4fr\") on node \"crc\" DevicePath \"\"" Oct 06 16:20:53 crc kubenswrapper[4763]: I1006 16:20:53.021040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqmcj" event={"ID":"08632e8e-5a58-413d-a3ba-c9454181600d","Type":"ContainerDied","Data":"e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243"} Oct 06 16:20:53 crc kubenswrapper[4763]: I1006 16:20:53.021107 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqmcj" Oct 06 16:20:53 crc kubenswrapper[4763]: I1006 16:20:53.021113 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4711388c0f46563351a74802afd450c2ff0bd3ec6676912748d2edc985b5243" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.851057 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6537-account-create-f2w6n"] Oct 06 16:20:59 crc kubenswrapper[4763]: E1006 16:20:59.852191 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08632e8e-5a58-413d-a3ba-c9454181600d" containerName="mariadb-database-create" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.852215 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08632e8e-5a58-413d-a3ba-c9454181600d" containerName="mariadb-database-create" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.852479 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08632e8e-5a58-413d-a3ba-c9454181600d" containerName="mariadb-database-create" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.853421 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.856510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvgn9\" (UniqueName: \"kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9\") pod \"barbican-6537-account-create-f2w6n\" (UID: \"85fab8df-dad8-437d-98cb-423007ec4737\") " pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.860529 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.867660 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6537-account-create-f2w6n"] Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.958160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvgn9\" (UniqueName: \"kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9\") pod \"barbican-6537-account-create-f2w6n\" (UID: \"85fab8df-dad8-437d-98cb-423007ec4737\") " pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:20:59 crc kubenswrapper[4763]: I1006 16:20:59.978505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvgn9\" (UniqueName: \"kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9\") pod \"barbican-6537-account-create-f2w6n\" (UID: \"85fab8df-dad8-437d-98cb-423007ec4737\") " pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:21:00 crc kubenswrapper[4763]: I1006 16:21:00.214372 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:21:00 crc kubenswrapper[4763]: I1006 16:21:00.725983 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6537-account-create-f2w6n"] Oct 06 16:21:01 crc kubenswrapper[4763]: I1006 16:21:01.096829 4763 generic.go:334] "Generic (PLEG): container finished" podID="85fab8df-dad8-437d-98cb-423007ec4737" containerID="ca536d3678c9897d02e0297702fce2291ee82ac35c6ebace1c9bdb771901fb3d" exitCode=0 Oct 06 16:21:01 crc kubenswrapper[4763]: I1006 16:21:01.096872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6537-account-create-f2w6n" event={"ID":"85fab8df-dad8-437d-98cb-423007ec4737","Type":"ContainerDied","Data":"ca536d3678c9897d02e0297702fce2291ee82ac35c6ebace1c9bdb771901fb3d"} Oct 06 16:21:01 crc kubenswrapper[4763]: I1006 16:21:01.096894 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6537-account-create-f2w6n" event={"ID":"85fab8df-dad8-437d-98cb-423007ec4737","Type":"ContainerStarted","Data":"25dd92980087ea4e93ade3472c655a3638f604d7644f77828862e9c8d2cb8d4d"} Oct 06 16:21:02 crc kubenswrapper[4763]: I1006 16:21:02.413182 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:21:02 crc kubenswrapper[4763]: I1006 16:21:02.504411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvgn9\" (UniqueName: \"kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9\") pod \"85fab8df-dad8-437d-98cb-423007ec4737\" (UID: \"85fab8df-dad8-437d-98cb-423007ec4737\") " Oct 06 16:21:02 crc kubenswrapper[4763]: I1006 16:21:02.509907 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9" (OuterVolumeSpecName: "kube-api-access-xvgn9") pod "85fab8df-dad8-437d-98cb-423007ec4737" (UID: "85fab8df-dad8-437d-98cb-423007ec4737"). InnerVolumeSpecName "kube-api-access-xvgn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:02 crc kubenswrapper[4763]: I1006 16:21:02.605921 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvgn9\" (UniqueName: \"kubernetes.io/projected/85fab8df-dad8-437d-98cb-423007ec4737-kube-api-access-xvgn9\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:03 crc kubenswrapper[4763]: I1006 16:21:03.111873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6537-account-create-f2w6n" event={"ID":"85fab8df-dad8-437d-98cb-423007ec4737","Type":"ContainerDied","Data":"25dd92980087ea4e93ade3472c655a3638f604d7644f77828862e9c8d2cb8d4d"} Oct 06 16:21:03 crc kubenswrapper[4763]: I1006 16:21:03.112268 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dd92980087ea4e93ade3472c655a3638f604d7644f77828862e9c8d2cb8d4d" Oct 06 16:21:03 crc kubenswrapper[4763]: I1006 16:21:03.111938 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6537-account-create-f2w6n" Oct 06 16:21:03 crc kubenswrapper[4763]: I1006 16:21:03.877326 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:21:03 crc kubenswrapper[4763]: I1006 16:21:03.877464 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.123291 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ng4zk"] Oct 06 16:21:05 crc kubenswrapper[4763]: E1006 16:21:05.123737 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fab8df-dad8-437d-98cb-423007ec4737" containerName="mariadb-account-create" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.123754 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fab8df-dad8-437d-98cb-423007ec4737" containerName="mariadb-account-create" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.123973 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fab8df-dad8-437d-98cb-423007ec4737" containerName="mariadb-account-create" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.124690 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.128084 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngxv8" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.128500 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.131571 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ng4zk"] Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.255268 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.255323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9tw\" (UniqueName: \"kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.255459 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.356846 
4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.356889 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9tw\" (UniqueName: \"kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.356990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.362491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.364544 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.393290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9tw\" (UniqueName: \"kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw\") pod \"barbican-db-sync-ng4zk\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.460165 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:05 crc kubenswrapper[4763]: I1006 16:21:05.911753 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ng4zk"] Oct 06 16:21:06 crc kubenswrapper[4763]: I1006 16:21:06.149866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ng4zk" event={"ID":"57c8a64f-69a3-41ef-9707-70aeede40d32","Type":"ContainerStarted","Data":"d7d8bcf1a880e9e944d29c9ecca87acb33ee02388f87ffefec835e182dfbd073"} Oct 06 16:21:06 crc kubenswrapper[4763]: I1006 16:21:06.150871 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ng4zk" event={"ID":"57c8a64f-69a3-41ef-9707-70aeede40d32","Type":"ContainerStarted","Data":"a1a7351f8b64d66f4a9c05d3357ea0285166c00dd20f96cdfede109005919adf"} Oct 06 16:21:06 crc kubenswrapper[4763]: I1006 16:21:06.176019 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ng4zk" podStartSLOduration=1.176001404 podStartE2EDuration="1.176001404s" podCreationTimestamp="2025-10-06 16:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:06.16988174 +0000 UTC m=+5263.325174252" watchObservedRunningTime="2025-10-06 16:21:06.176001404 +0000 UTC m=+5263.331293916" Oct 06 16:21:08 crc kubenswrapper[4763]: I1006 16:21:08.174132 4763 generic.go:334] "Generic (PLEG): container finished" podID="57c8a64f-69a3-41ef-9707-70aeede40d32" containerID="d7d8bcf1a880e9e944d29c9ecca87acb33ee02388f87ffefec835e182dfbd073" exitCode=0 Oct 06 16:21:08 crc kubenswrapper[4763]: I1006 16:21:08.174200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ng4zk" event={"ID":"57c8a64f-69a3-41ef-9707-70aeede40d32","Type":"ContainerDied","Data":"d7d8bcf1a880e9e944d29c9ecca87acb33ee02388f87ffefec835e182dfbd073"} Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.535395 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.630922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data\") pod \"57c8a64f-69a3-41ef-9707-70aeede40d32\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.631030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj9tw\" (UniqueName: \"kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw\") pod \"57c8a64f-69a3-41ef-9707-70aeede40d32\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.631068 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle\") pod \"57c8a64f-69a3-41ef-9707-70aeede40d32\" (UID: \"57c8a64f-69a3-41ef-9707-70aeede40d32\") " Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.638937 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw" (OuterVolumeSpecName: "kube-api-access-xj9tw") pod "57c8a64f-69a3-41ef-9707-70aeede40d32" (UID: "57c8a64f-69a3-41ef-9707-70aeede40d32"). InnerVolumeSpecName "kube-api-access-xj9tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.638961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "57c8a64f-69a3-41ef-9707-70aeede40d32" (UID: "57c8a64f-69a3-41ef-9707-70aeede40d32"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.668183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57c8a64f-69a3-41ef-9707-70aeede40d32" (UID: "57c8a64f-69a3-41ef-9707-70aeede40d32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.733876 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.733915 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj9tw\" (UniqueName: \"kubernetes.io/projected/57c8a64f-69a3-41ef-9707-70aeede40d32-kube-api-access-xj9tw\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:09 crc kubenswrapper[4763]: I1006 16:21:09.733928 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a64f-69a3-41ef-9707-70aeede40d32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.197115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ng4zk" event={"ID":"57c8a64f-69a3-41ef-9707-70aeede40d32","Type":"ContainerDied","Data":"a1a7351f8b64d66f4a9c05d3357ea0285166c00dd20f96cdfede109005919adf"} Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.197177 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1a7351f8b64d66f4a9c05d3357ea0285166c00dd20f96cdfede109005919adf" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.197205 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ng4zk" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.461663 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-77fcfc7bd6-g8998"] Oct 06 16:21:10 crc kubenswrapper[4763]: E1006 16:21:10.462131 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c8a64f-69a3-41ef-9707-70aeede40d32" containerName="barbican-db-sync" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.462155 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c8a64f-69a3-41ef-9707-70aeede40d32" containerName="barbican-db-sync" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.462388 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c8a64f-69a3-41ef-9707-70aeede40d32" containerName="barbican-db-sync" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.463499 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.465349 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.467124 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.467428 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngxv8" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.470849 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-555d589d8f-2chkm"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.472572 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.474240 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.483674 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-555d589d8f-2chkm"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.500782 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77fcfc7bd6-g8998"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.554531 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.557047 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.580427 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.639106 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f584d8ff6-z7w94"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.645959 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8w8t\" (UniqueName: \"kubernetes.io/projected/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-kube-api-access-h8w8t\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-combined-ca-bundle\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-logs\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " 
pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.649925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data-custom\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650248 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0747a11c-cd13-464e-a61d-4dd28334ba1d-logs\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-combined-ca-bundle\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data-custom\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwdzl\" (UniqueName: \"kubernetes.io/projected/0747a11c-cd13-464e-a61d-4dd28334ba1d-kube-api-access-rwdzl\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.650992 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f584d8ff6-z7w94"] Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwdzl\" (UniqueName: \"kubernetes.io/projected/0747a11c-cd13-464e-a61d-4dd28334ba1d-kube-api-access-rwdzl\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2z2\" (UniqueName: \"kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8w8t\" (UniqueName: \"kubernetes.io/projected/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-kube-api-access-h8w8t\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752703 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752730 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-combined-ca-bundle\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-logs\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752787 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data-custom\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752821 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbb8d436-52d9-43fa-aa95-550b13d26658-logs\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752844 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data-custom\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-combined-ca-bundle\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752967 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82spz\" (UniqueName: \"kubernetes.io/projected/cbb8d436-52d9-43fa-aa95-550b13d26658-kube-api-access-82spz\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.752996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0747a11c-cd13-464e-a61d-4dd28334ba1d-logs\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.753018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-combined-ca-bundle\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.753041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data-custom\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " 
pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.754377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-logs\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.754990 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0747a11c-cd13-464e-a61d-4dd28334ba1d-logs\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.759308 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.760700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-combined-ca-bundle\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.767235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.773478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-config-data-custom\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.773527 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0747a11c-cd13-464e-a61d-4dd28334ba1d-combined-ca-bundle\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.774088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-config-data-custom\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: \"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.774274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8w8t\" (UniqueName: \"kubernetes.io/projected/487c6f7a-74f9-4132-b3a9-5dcba1bcad30-kube-api-access-h8w8t\") pod \"barbican-keystone-listener-77fcfc7bd6-g8998\" (UID: 
\"487c6f7a-74f9-4132-b3a9-5dcba1bcad30\") " pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.777523 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwdzl\" (UniqueName: \"kubernetes.io/projected/0747a11c-cd13-464e-a61d-4dd28334ba1d-kube-api-access-rwdzl\") pod \"barbican-worker-555d589d8f-2chkm\" (UID: \"0747a11c-cd13-464e-a61d-4dd28334ba1d\") " pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.789684 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.800535 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-555d589d8f-2chkm" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-combined-ca-bundle\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82spz\" (UniqueName: \"kubernetes.io/projected/cbb8d436-52d9-43fa-aa95-550b13d26658-kube-api-access-82spz\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2z2\" (UniqueName: \"kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: 
\"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbb8d436-52d9-43fa-aa95-550b13d26658-logs\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data-custom\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.856980 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.857970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.858492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.860049 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.860366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbb8d436-52d9-43fa-aa95-550b13d26658-logs\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.862125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data-custom\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.862427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-combined-ca-bundle\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 
16:21:10.862894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.863811 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb8d436-52d9-43fa-aa95-550b13d26658-config-data\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.879845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2z2\" (UniqueName: \"kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2\") pod \"dnsmasq-dns-869545f9c9-9skjc\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.881690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82spz\" (UniqueName: \"kubernetes.io/projected/cbb8d436-52d9-43fa-aa95-550b13d26658-kube-api-access-82spz\") pod \"barbican-api-7f584d8ff6-z7w94\" (UID: \"cbb8d436-52d9-43fa-aa95-550b13d26658\") " pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.885220 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:10 crc kubenswrapper[4763]: I1006 16:21:10.987522 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:11 crc kubenswrapper[4763]: I1006 16:21:11.312993 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f584d8ff6-z7w94"] Oct 06 16:21:11 crc kubenswrapper[4763]: I1006 16:21:11.368408 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77fcfc7bd6-g8998"] Oct 06 16:21:11 crc kubenswrapper[4763]: I1006 16:21:11.423369 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-555d589d8f-2chkm"] Oct 06 16:21:11 crc kubenswrapper[4763]: I1006 16:21:11.433426 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:21:11 crc kubenswrapper[4763]: W1006 16:21:11.441201 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0747a11c_cd13_464e_a61d_4dd28334ba1d.slice/crio-c8ec2702d6bad658c4abee5f57c4dbfdba6007ccba7a863fd2571f7c6afa9663 WatchSource:0}: Error finding container c8ec2702d6bad658c4abee5f57c4dbfdba6007ccba7a863fd2571f7c6afa9663: Status 404 returned error can't find the container with id c8ec2702d6bad658c4abee5f57c4dbfdba6007ccba7a863fd2571f7c6afa9663 Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.217420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" event={"ID":"487c6f7a-74f9-4132-b3a9-5dcba1bcad30","Type":"ContainerStarted","Data":"d980e059ebecbd8c75a0cd5e766622546983dfdb292b6ec9bfa22612d6ed1013"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.217760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" event={"ID":"487c6f7a-74f9-4132-b3a9-5dcba1bcad30","Type":"ContainerStarted","Data":"b6e7282eaf0d78aaf4cd83adbd0f778940b2aae2120b939961c0033e1aa4e8b1"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.217771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" event={"ID":"487c6f7a-74f9-4132-b3a9-5dcba1bcad30","Type":"ContainerStarted","Data":"3256ecba7669fbf2459fdfb875bb670c4f49e2b624e3bc5858d1aad22ab6d955"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.220780 4763 generic.go:334] "Generic (PLEG): container finished" podID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerID="1a668babf2a6aab7be3658fd6e681a2ac42e79c807e4d91b9ffcaf3fe0049bc4" exitCode=0 Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.220857 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" event={"ID":"d86151f8-f0a4-4b49-ad2d-a33db299831a","Type":"ContainerDied","Data":"1a668babf2a6aab7be3658fd6e681a2ac42e79c807e4d91b9ffcaf3fe0049bc4"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.220882 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" event={"ID":"d86151f8-f0a4-4b49-ad2d-a33db299831a","Type":"ContainerStarted","Data":"545f42b2ba67dfb9b6a1d1727c169212796a332b3d3da9b73617d3312e151d40"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.226390 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f584d8ff6-z7w94" event={"ID":"cbb8d436-52d9-43fa-aa95-550b13d26658","Type":"ContainerStarted","Data":"22b770cc0babb99fdb106708ca7e305d4f34bb163e6162f3d77842e7891ff4cb"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.226435 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f584d8ff6-z7w94" event={"ID":"cbb8d436-52d9-43fa-aa95-550b13d26658","Type":"ContainerStarted","Data":"95190cafddca58ec1bf60a895cb7a4c8f67c319e9dd40cbe01914651fb40fd87"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.226445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f584d8ff6-z7w94" event={"ID":"cbb8d436-52d9-43fa-aa95-550b13d26658","Type":"ContainerStarted","Data":"ef0adc6b2b94af9eda6f6114b713e6f12be03fe60359fbdbcb415c575f75e7a5"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.226558 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.226584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.228845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-555d589d8f-2chkm" event={"ID":"0747a11c-cd13-464e-a61d-4dd28334ba1d","Type":"ContainerStarted","Data":"fb9726a07c387d9775f3eb73b537c32910f297b4dfb78cf5ae6c54a7a1b2029e"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.228879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-555d589d8f-2chkm" event={"ID":"0747a11c-cd13-464e-a61d-4dd28334ba1d","Type":"ContainerStarted","Data":"9a9ecfb8dbe241e44f6fb8f4d15c5ab763189134ee369bb9e230471dcdd66af5"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.228891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-555d589d8f-2chkm" event={"ID":"0747a11c-cd13-464e-a61d-4dd28334ba1d","Type":"ContainerStarted","Data":"c8ec2702d6bad658c4abee5f57c4dbfdba6007ccba7a863fd2571f7c6afa9663"} Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.246188 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-77fcfc7bd6-g8998" podStartSLOduration=2.246165769 podStartE2EDuration="2.246165769s" podCreationTimestamp="2025-10-06 16:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:12.236049808 +0000 UTC m=+5269.391342330" watchObservedRunningTime="2025-10-06 16:21:12.246165769 +0000 UTC m=+5269.401458281" Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.261376 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-555d589d8f-2chkm" podStartSLOduration=2.261354276 podStartE2EDuration="2.261354276s" podCreationTimestamp="2025-10-06 16:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:12.255044267 +0000 UTC m=+5269.410336779" watchObservedRunningTime="2025-10-06 16:21:12.261354276 +0000 UTC m=+5269.416646788" Oct 06 16:21:12 crc kubenswrapper[4763]: I1006 16:21:12.279093 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f584d8ff6-z7w94" podStartSLOduration=2.279072791 podStartE2EDuration="2.279072791s" podCreationTimestamp="2025-10-06 16:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:12.277957111 +0000 UTC m=+5269.433249633" 
watchObservedRunningTime="2025-10-06 16:21:12.279072791 +0000 UTC m=+5269.434365303" Oct 06 16:21:13 crc kubenswrapper[4763]: I1006 16:21:13.241149 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" event={"ID":"d86151f8-f0a4-4b49-ad2d-a33db299831a","Type":"ContainerStarted","Data":"c3cb9211955b463705a862764ae1f834f11bcbdb04c993625b8114b6ecb4d340"} Oct 06 16:21:13 crc kubenswrapper[4763]: I1006 16:21:13.241345 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:13 crc kubenswrapper[4763]: I1006 16:21:13.264573 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" podStartSLOduration=3.264556895 podStartE2EDuration="3.264556895s" podCreationTimestamp="2025-10-06 16:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:13.259091979 +0000 UTC m=+5270.414384531" watchObservedRunningTime="2025-10-06 16:21:13.264556895 +0000 UTC m=+5270.419849407" Oct 06 16:21:20 crc kubenswrapper[4763]: I1006 16:21:20.888166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:21:20 crc kubenswrapper[4763]: I1006 16:21:20.968909 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:21:20 crc kubenswrapper[4763]: I1006 16:21:20.969189 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="dnsmasq-dns" containerID="cri-o://82cd5a691147581f75d57d268c2b94f08ff9258b116c1085e83508ddfac252a3" gracePeriod=10 Oct 06 16:21:20 crc kubenswrapper[4763]: I1006 16:21:20.987665 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.20:5353: connect: connection refused" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.324793 4763 generic.go:334] "Generic (PLEG): container finished" podID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerID="82cd5a691147581f75d57d268c2b94f08ff9258b116c1085e83508ddfac252a3" exitCode=0 Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.324979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" event={"ID":"abbbc32d-99b3-4a65-951b-11aa0daaefd5","Type":"ContainerDied","Data":"82cd5a691147581f75d57d268c2b94f08ff9258b116c1085e83508ddfac252a3"} Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.519741 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.687898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb\") pod \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.687956 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmv4b\" (UniqueName: \"kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b\") pod \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.688054 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb\") pod \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.688089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config\") pod \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.688130 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc\") pod \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\" (UID: \"abbbc32d-99b3-4a65-951b-11aa0daaefd5\") " Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.695965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b" (OuterVolumeSpecName: "kube-api-access-vmv4b") pod "abbbc32d-99b3-4a65-951b-11aa0daaefd5" (UID: "abbbc32d-99b3-4a65-951b-11aa0daaefd5"). InnerVolumeSpecName "kube-api-access-vmv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.725148 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abbbc32d-99b3-4a65-951b-11aa0daaefd5" (UID: "abbbc32d-99b3-4a65-951b-11aa0daaefd5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.745604 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abbbc32d-99b3-4a65-951b-11aa0daaefd5" (UID: "abbbc32d-99b3-4a65-951b-11aa0daaefd5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.748438 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abbbc32d-99b3-4a65-951b-11aa0daaefd5" (UID: "abbbc32d-99b3-4a65-951b-11aa0daaefd5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.749782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config" (OuterVolumeSpecName: "config") pod "abbbc32d-99b3-4a65-951b-11aa0daaefd5" (UID: "abbbc32d-99b3-4a65-951b-11aa0daaefd5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.789964 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.790008 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.790018 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.790031 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmv4b\" (UniqueName: \"kubernetes.io/projected/abbbc32d-99b3-4a65-951b-11aa0daaefd5-kube-api-access-vmv4b\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:21 crc kubenswrapper[4763]: I1006 16:21:21.790040 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbc32d-99b3-4a65-951b-11aa0daaefd5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.334160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" event={"ID":"abbbc32d-99b3-4a65-951b-11aa0daaefd5","Type":"ContainerDied","Data":"a1b390e8850dd91cc3d2cc3754fc9ec10b4e7951f2a3e7b0f97cf6294fce6955"} Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.334235 4763 scope.go:117] "RemoveContainer" containerID="82cd5a691147581f75d57d268c2b94f08ff9258b116c1085e83508ddfac252a3" Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.334243 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-vc6dc" Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.373141 4763 scope.go:117] "RemoveContainer" containerID="f3a66e8def67ce19bd7428ccd312d6e003ed9a7f9ae0e85fc58c0ea42e6b2bcb" Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.383957 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.392140 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-vc6dc"] Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.534366 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:22 crc kubenswrapper[4763]: I1006 16:21:22.671775 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f584d8ff6-z7w94" Oct 06 16:21:23 crc kubenswrapper[4763]: I1006 16:21:23.589263 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" path="/var/lib/kubelet/pods/abbbc32d-99b3-4a65-951b-11aa0daaefd5/volumes" Oct 06 16:21:33 crc kubenswrapper[4763]: I1006 16:21:33.876966 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:21:33 crc kubenswrapper[4763]: I1006 16:21:33.878815 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:21:33 crc kubenswrapper[4763]: I1006 16:21:33.878961 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:21:33 crc kubenswrapper[4763]: I1006 16:21:33.879698 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:21:33 crc kubenswrapper[4763]: I1006 16:21:33.879886 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" gracePeriod=600 Oct 06 16:21:34 crc kubenswrapper[4763]: E1006 16:21:34.013544 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:21:34 crc kubenswrapper[4763]: I1006 16:21:34.486693 4763 generic.go:334] 
"Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" exitCode=0 Oct 06 16:21:34 crc kubenswrapper[4763]: I1006 16:21:34.486750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b"} Oct 06 16:21:34 crc kubenswrapper[4763]: I1006 16:21:34.486794 4763 scope.go:117] "RemoveContainer" containerID="19a7c73fae42a6138f9b916df098fa093d1f8b04e8954f8f59d3dfa210830e42" Oct 06 16:21:34 crc kubenswrapper[4763]: I1006 16:21:34.487147 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:21:34 crc kubenswrapper[4763]: E1006 16:21:34.487375 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.477820 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fl2l9"] Oct 06 16:21:36 crc kubenswrapper[4763]: E1006 16:21:36.478534 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="init" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.478551 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="init" Oct 06 16:21:36 crc kubenswrapper[4763]: E1006 16:21:36.478569 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="dnsmasq-dns" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.478578 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="dnsmasq-dns" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.478815 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="abbbc32d-99b3-4a65-951b-11aa0daaefd5" containerName="dnsmasq-dns" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.479466 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.485913 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fl2l9"] Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.561827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n26h\" (UniqueName: \"kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h\") pod \"neutron-db-create-fl2l9\" (UID: \"3cd968ee-8b72-4352-b0d9-0ef485399021\") " pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.663944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n26h\" (UniqueName: \"kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h\") pod \"neutron-db-create-fl2l9\" (UID: \"3cd968ee-8b72-4352-b0d9-0ef485399021\") " pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.696382 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n26h\" (UniqueName: \"kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h\") pod \"neutron-db-create-fl2l9\" (UID: \"3cd968ee-8b72-4352-b0d9-0ef485399021\") " pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:36 crc kubenswrapper[4763]: I1006 16:21:36.841539 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:37 crc kubenswrapper[4763]: I1006 16:21:37.286914 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fl2l9"] Oct 06 16:21:37 crc kubenswrapper[4763]: I1006 16:21:37.536457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fl2l9" event={"ID":"3cd968ee-8b72-4352-b0d9-0ef485399021","Type":"ContainerStarted","Data":"986a79dc1cdfd45a05e6879a8f8f967ea7acf4a6abe1a1f33b258cf415cbf8d3"} Oct 06 16:21:37 crc kubenswrapper[4763]: I1006 16:21:37.536518 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fl2l9" event={"ID":"3cd968ee-8b72-4352-b0d9-0ef485399021","Type":"ContainerStarted","Data":"dd5d7e83908d4136ba715e4815f951a6128099115d4eaebf35c0b5735a2df338"} Oct 06 16:21:37 crc kubenswrapper[4763]: I1006 16:21:37.561039 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-fl2l9" podStartSLOduration=1.561015652 podStartE2EDuration="1.561015652s" podCreationTimestamp="2025-10-06 16:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:37.556163212 +0000 UTC m=+5294.711455804" watchObservedRunningTime="2025-10-06 16:21:37.561015652 +0000 UTC m=+5294.716308174" Oct 06 16:21:38 crc kubenswrapper[4763]: I1006 16:21:38.548291 4763 generic.go:334] "Generic (PLEG): container finished" podID="3cd968ee-8b72-4352-b0d9-0ef485399021" containerID="986a79dc1cdfd45a05e6879a8f8f967ea7acf4a6abe1a1f33b258cf415cbf8d3" exitCode=0 Oct 06 16:21:38 crc kubenswrapper[4763]: I1006 16:21:38.548366 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fl2l9" event={"ID":"3cd968ee-8b72-4352-b0d9-0ef485399021","Type":"ContainerDied","Data":"986a79dc1cdfd45a05e6879a8f8f967ea7acf4a6abe1a1f33b258cf415cbf8d3"} Oct 06 16:21:39 crc kubenswrapper[4763]: I1006 
16:21:39.940573 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.023742 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n26h\" (UniqueName: \"kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h\") pod \"3cd968ee-8b72-4352-b0d9-0ef485399021\" (UID: \"3cd968ee-8b72-4352-b0d9-0ef485399021\") " Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.035954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h" (OuterVolumeSpecName: "kube-api-access-9n26h") pod "3cd968ee-8b72-4352-b0d9-0ef485399021" (UID: "3cd968ee-8b72-4352-b0d9-0ef485399021"). InnerVolumeSpecName "kube-api-access-9n26h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.126457 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n26h\" (UniqueName: \"kubernetes.io/projected/3cd968ee-8b72-4352-b0d9-0ef485399021-kube-api-access-9n26h\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.575155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fl2l9" event={"ID":"3cd968ee-8b72-4352-b0d9-0ef485399021","Type":"ContainerDied","Data":"dd5d7e83908d4136ba715e4815f951a6128099115d4eaebf35c0b5735a2df338"} Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.575286 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5d7e83908d4136ba715e4815f951a6128099115d4eaebf35c0b5735a2df338" Oct 06 16:21:40 crc kubenswrapper[4763]: I1006 16:21:40.575202 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fl2l9" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.586565 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a665-account-create-5jwsj"] Oct 06 16:21:46 crc kubenswrapper[4763]: E1006 16:21:46.588033 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd968ee-8b72-4352-b0d9-0ef485399021" containerName="mariadb-database-create" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.588055 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd968ee-8b72-4352-b0d9-0ef485399021" containerName="mariadb-database-create" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.588277 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd968ee-8b72-4352-b0d9-0ef485399021" containerName="mariadb-database-create" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.589180 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.593428 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.599056 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a665-account-create-5jwsj"] Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.683450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm7t\" (UniqueName: \"kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t\") pod \"neutron-a665-account-create-5jwsj\" (UID: \"5ece696e-b1c4-41e9-8067-7f86114e4cbe\") " pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.785129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm7t\" (UniqueName: \"kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t\") pod \"neutron-a665-account-create-5jwsj\" (UID: \"5ece696e-b1c4-41e9-8067-7f86114e4cbe\") " pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.815352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm7t\" (UniqueName: \"kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t\") pod \"neutron-a665-account-create-5jwsj\" (UID: \"5ece696e-b1c4-41e9-8067-7f86114e4cbe\") " pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:46 crc kubenswrapper[4763]: I1006 16:21:46.944022 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:47 crc kubenswrapper[4763]: I1006 16:21:47.446257 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a665-account-create-5jwsj"] Oct 06 16:21:47 crc kubenswrapper[4763]: I1006 16:21:47.676485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a665-account-create-5jwsj" event={"ID":"5ece696e-b1c4-41e9-8067-7f86114e4cbe","Type":"ContainerStarted","Data":"462c3d4b549efd1e59fb3371977791ca076e53388f28fca8b2950c6e916d5077"} Oct 06 16:21:47 crc kubenswrapper[4763]: I1006 16:21:47.677822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a665-account-create-5jwsj" event={"ID":"5ece696e-b1c4-41e9-8067-7f86114e4cbe","Type":"ContainerStarted","Data":"d26b409d57c334b1827f1509887b88bfc12852e0913f07a2005dd2009d97d91e"} Oct 06 16:21:47 crc kubenswrapper[4763]: I1006 16:21:47.698423 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-a665-account-create-5jwsj" podStartSLOduration=1.698392119 podStartE2EDuration="1.698392119s" podCreationTimestamp="2025-10-06 16:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:47.692763228 +0000 UTC m=+5304.848055740" watchObservedRunningTime="2025-10-06 16:21:47.698392119 +0000 UTC m=+5304.853684621" Oct 06 16:21:48 crc kubenswrapper[4763]: I1006 16:21:48.688187 4763 generic.go:334] "Generic (PLEG): container finished" podID="5ece696e-b1c4-41e9-8067-7f86114e4cbe" containerID="462c3d4b549efd1e59fb3371977791ca076e53388f28fca8b2950c6e916d5077" exitCode=0 Oct 06 16:21:48 crc kubenswrapper[4763]: I1006 16:21:48.688245 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a665-account-create-5jwsj" event={"ID":"5ece696e-b1c4-41e9-8067-7f86114e4cbe","Type":"ContainerDied","Data":"462c3d4b549efd1e59fb3371977791ca076e53388f28fca8b2950c6e916d5077"} Oct 06 16:21:49 crc kubenswrapper[4763]: I1006 16:21:49.575652 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:21:49 crc kubenswrapper[4763]: E1006 16:21:49.576262 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.072241 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.157139 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmm7t\" (UniqueName: \"kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t\") pod \"5ece696e-b1c4-41e9-8067-7f86114e4cbe\" (UID: \"5ece696e-b1c4-41e9-8067-7f86114e4cbe\") " Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.163764 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t" (OuterVolumeSpecName: "kube-api-access-fmm7t") pod "5ece696e-b1c4-41e9-8067-7f86114e4cbe" (UID: "5ece696e-b1c4-41e9-8067-7f86114e4cbe"). InnerVolumeSpecName "kube-api-access-fmm7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.259745 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmm7t\" (UniqueName: \"kubernetes.io/projected/5ece696e-b1c4-41e9-8067-7f86114e4cbe-kube-api-access-fmm7t\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.711397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a665-account-create-5jwsj" event={"ID":"5ece696e-b1c4-41e9-8067-7f86114e4cbe","Type":"ContainerDied","Data":"d26b409d57c334b1827f1509887b88bfc12852e0913f07a2005dd2009d97d91e"} Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.711444 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d26b409d57c334b1827f1509887b88bfc12852e0913f07a2005dd2009d97d91e" Oct 06 16:21:50 crc kubenswrapper[4763]: I1006 16:21:50.711520 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a665-account-create-5jwsj" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.717147 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kf4p9"] Oct 06 16:21:51 crc kubenswrapper[4763]: E1006 16:21:51.717882 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ece696e-b1c4-41e9-8067-7f86114e4cbe" containerName="mariadb-account-create" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.717899 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ece696e-b1c4-41e9-8067-7f86114e4cbe" containerName="mariadb-account-create" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.718149 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ece696e-b1c4-41e9-8067-7f86114e4cbe" containerName="mariadb-account-create" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.718880 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.720574 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.728243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.728495 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hdr67" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.739382 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kf4p9"] Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.789076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.789156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf6jc\" (UniqueName: \"kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.789272 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.891383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.891509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf6jc\" (UniqueName: \"kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc\") pod \"neutron-db-sync-kf4p9\" (UID: 
\"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.891569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.901839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.901969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:51 crc kubenswrapper[4763]: I1006 16:21:51.910034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf6jc\" (UniqueName: \"kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc\") pod \"neutron-db-sync-kf4p9\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:52 crc kubenswrapper[4763]: I1006 16:21:52.036139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:52 crc kubenswrapper[4763]: I1006 16:21:52.481952 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kf4p9"] Oct 06 16:21:52 crc kubenswrapper[4763]: W1006 16:21:52.499917 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a71e1f5_866b_4034_808f_878e473fbbc3.slice/crio-84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451 WatchSource:0}: Error finding container 84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451: Status 404 returned error can't find the container with id 84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451 Oct 06 16:21:52 crc kubenswrapper[4763]: I1006 16:21:52.727786 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kf4p9" event={"ID":"4a71e1f5-866b-4034-808f-878e473fbbc3","Type":"ContainerStarted","Data":"243d4b03a3e8dd986745b8e2717056948bae1c9aa97fbe23e30d58d41c8eed2d"} Oct 06 16:21:52 crc kubenswrapper[4763]: I1006 16:21:52.727839 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kf4p9" event={"ID":"4a71e1f5-866b-4034-808f-878e473fbbc3","Type":"ContainerStarted","Data":"84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451"} Oct 06 16:21:52 crc kubenswrapper[4763]: I1006 16:21:52.749232 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kf4p9" podStartSLOduration=1.7492099030000001 podStartE2EDuration="1.749209903s" podCreationTimestamp="2025-10-06 16:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:21:52.74573982 +0000 UTC m=+5309.901032352" 
watchObservedRunningTime="2025-10-06 16:21:52.749209903 +0000 UTC m=+5309.904502415" Oct 06 16:21:56 crc kubenswrapper[4763]: I1006 16:21:56.768258 4763 generic.go:334] "Generic (PLEG): container finished" podID="4a71e1f5-866b-4034-808f-878e473fbbc3" containerID="243d4b03a3e8dd986745b8e2717056948bae1c9aa97fbe23e30d58d41c8eed2d" exitCode=0 Oct 06 16:21:56 crc kubenswrapper[4763]: I1006 16:21:56.768392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kf4p9" event={"ID":"4a71e1f5-866b-4034-808f-878e473fbbc3","Type":"ContainerDied","Data":"243d4b03a3e8dd986745b8e2717056948bae1c9aa97fbe23e30d58d41c8eed2d"} Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.203170 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.338563 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config\") pod \"4a71e1f5-866b-4034-808f-878e473fbbc3\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.338723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle\") pod \"4a71e1f5-866b-4034-808f-878e473fbbc3\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.338795 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf6jc\" (UniqueName: \"kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc\") pod \"4a71e1f5-866b-4034-808f-878e473fbbc3\" (UID: \"4a71e1f5-866b-4034-808f-878e473fbbc3\") " Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.345548 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc" (OuterVolumeSpecName: "kube-api-access-nf6jc") pod "4a71e1f5-866b-4034-808f-878e473fbbc3" (UID: "4a71e1f5-866b-4034-808f-878e473fbbc3"). InnerVolumeSpecName "kube-api-access-nf6jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.363588 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config" (OuterVolumeSpecName: "config") pod "4a71e1f5-866b-4034-808f-878e473fbbc3" (UID: "4a71e1f5-866b-4034-808f-878e473fbbc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.364430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a71e1f5-866b-4034-808f-878e473fbbc3" (UID: "4a71e1f5-866b-4034-808f-878e473fbbc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.441079 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf6jc\" (UniqueName: \"kubernetes.io/projected/4a71e1f5-866b-4034-808f-878e473fbbc3-kube-api-access-nf6jc\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.441147 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.441168 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a71e1f5-866b-4034-808f-878e473fbbc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.791148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kf4p9" event={"ID":"4a71e1f5-866b-4034-808f-878e473fbbc3","Type":"ContainerDied","Data":"84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451"} Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.791206 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84a75e5b161eba23a1f3c88ba5b248d9e00d74c77d17a7f044869edc9f6f7451" Oct 06 16:21:58 crc kubenswrapper[4763]: I1006 16:21:58.791296 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kf4p9" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.094378 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:21:59 crc kubenswrapper[4763]: E1006 16:21:59.095741 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a71e1f5-866b-4034-808f-878e473fbbc3" containerName="neutron-db-sync" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.095770 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a71e1f5-866b-4034-808f-878e473fbbc3" containerName="neutron-db-sync" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.095982 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a71e1f5-866b-4034-808f-878e473fbbc3" containerName="neutron-db-sync" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.101558 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.111374 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.163439 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-765d7995c-g9879"] Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.165102 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.169098 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.169877 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hdr67" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.170858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.192552 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-765d7995c-g9879"] Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274826 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lbc\" (UniqueName: \"kubernetes.io/projected/74d03f25-9223-4fd7-b049-a4f63399b6c6-kube-api-access-n9lbc\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274894 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274912 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-combined-ca-bundle\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274937 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-httpd-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.274993 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc 
kubenswrapper[4763]: I1006 16:21:59.275023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.275050 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbcft\" (UniqueName: \"kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.377268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbcft\" (UniqueName: \"kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.377404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lbc\" (UniqueName: \"kubernetes.io/projected/74d03f25-9223-4fd7-b049-a4f63399b6c6-kube-api-access-n9lbc\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.377705 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.377737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.378701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.378871 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.378950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-combined-ca-bundle\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.378979 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.379673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.379781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-httpd-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.379809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.379866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.381509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.384296 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-combined-ca-bundle\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.385666 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-httpd-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.385947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d03f25-9223-4fd7-b049-a4f63399b6c6-config\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.394151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbcft\" (UniqueName: \"kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft\") pod \"dnsmasq-dns-94d77d5bf-6727t\" (UID: 
\"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.407936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lbc\" (UniqueName: \"kubernetes.io/projected/74d03f25-9223-4fd7-b049-a4f63399b6c6-kube-api-access-n9lbc\") pod \"neutron-765d7995c-g9879\" (UID: \"74d03f25-9223-4fd7-b049-a4f63399b6c6\") " pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.438170 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.491803 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-765d7995c-g9879" Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.721724 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:21:59 crc kubenswrapper[4763]: I1006 16:21:59.809274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" event={"ID":"6e4bac65-de72-45e5-8dfe-495462ccc48c","Type":"ContainerStarted","Data":"db1a79e3d398848d31708fcaa5272b6bae42bc2b6f3788a081dc8253e3ced807"} Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.110600 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-765d7995c-g9879"] Oct 06 16:22:00 crc kubenswrapper[4763]: W1006 16:22:00.112705 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d03f25_9223_4fd7_b049_a4f63399b6c6.slice/crio-d7b3043d425a7d73ab1190d5edff6601ab96f47f3b0cdac510176636341ddf4c WatchSource:0}: Error finding container d7b3043d425a7d73ab1190d5edff6601ab96f47f3b0cdac510176636341ddf4c: Status 404 returned error can't find the container with id d7b3043d425a7d73ab1190d5edff6601ab96f47f3b0cdac510176636341ddf4c Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.576130 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:22:00 crc kubenswrapper[4763]: E1006 16:22:00.576580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.820082 4763 generic.go:334] "Generic (PLEG): container finished" podID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerID="2703d244d3da143b8b64b50ede21d1e00a3099322abdc7a409d8dc9a315a375a" exitCode=0 Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.820172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" event={"ID":"6e4bac65-de72-45e5-8dfe-495462ccc48c","Type":"ContainerDied","Data":"2703d244d3da143b8b64b50ede21d1e00a3099322abdc7a409d8dc9a315a375a"} Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.823375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765d7995c-g9879" 
event={"ID":"74d03f25-9223-4fd7-b049-a4f63399b6c6","Type":"ContainerStarted","Data":"ae70f5a50d899ae2bbc1d622eefcd500db905c5f6d2301497e138b36d8f91d98"} Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.823419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765d7995c-g9879" event={"ID":"74d03f25-9223-4fd7-b049-a4f63399b6c6","Type":"ContainerStarted","Data":"d8ca22cde1a791cd4cf4af1b8d674dfe708b2751f4e57c8175b4c335f5b551b9"} Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.823433 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765d7995c-g9879" event={"ID":"74d03f25-9223-4fd7-b049-a4f63399b6c6","Type":"ContainerStarted","Data":"d7b3043d425a7d73ab1190d5edff6601ab96f47f3b0cdac510176636341ddf4c"} Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.823575 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-765d7995c-g9879" Oct 06 16:22:00 crc kubenswrapper[4763]: I1006 16:22:00.880390 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-765d7995c-g9879" podStartSLOduration=1.880369038 podStartE2EDuration="1.880369038s" podCreationTimestamp="2025-10-06 16:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:22:00.873369021 +0000 UTC m=+5318.028661553" watchObservedRunningTime="2025-10-06 16:22:00.880369038 +0000 UTC m=+5318.035661550" Oct 06 16:22:01 crc kubenswrapper[4763]: I1006 16:22:01.830552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" event={"ID":"6e4bac65-de72-45e5-8dfe-495462ccc48c","Type":"ContainerStarted","Data":"f51f9d060969c20bd21e3415bd1dae553b3c25613b8d2b82a29e2ec8e552372b"} Oct 06 16:22:01 crc kubenswrapper[4763]: I1006 16:22:01.858331 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" podStartSLOduration=2.858314591 podStartE2EDuration="2.858314591s" podCreationTimestamp="2025-10-06 16:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:22:01.849788082 +0000 UTC m=+5319.005080594" watchObservedRunningTime="2025-10-06 16:22:01.858314591 +0000 UTC m=+5319.013607103" Oct 06 16:22:02 crc kubenswrapper[4763]: I1006 16:22:02.839488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:22:09 crc kubenswrapper[4763]: I1006 16:22:09.439855 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:22:09 crc kubenswrapper[4763]: I1006 16:22:09.514025 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:22:09 crc kubenswrapper[4763]: I1006 16:22:09.515771 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="dnsmasq-dns" containerID="cri-o://c3cb9211955b463705a862764ae1f834f11bcbdb04c993625b8114b6ecb4d340" gracePeriod=10 Oct 06 16:22:09 crc kubenswrapper[4763]: I1006 16:22:09.941671 4763 generic.go:334] "Generic (PLEG): container finished" podID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerID="c3cb9211955b463705a862764ae1f834f11bcbdb04c993625b8114b6ecb4d340" exitCode=0 Oct 06 16:22:09 crc 
kubenswrapper[4763]: I1006 16:22:09.942000 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" event={"ID":"d86151f8-f0a4-4b49-ad2d-a33db299831a","Type":"ContainerDied","Data":"c3cb9211955b463705a862764ae1f834f11bcbdb04c993625b8114b6ecb4d340"} Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.097061 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.192827 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config\") pod \"d86151f8-f0a4-4b49-ad2d-a33db299831a\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.192895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb\") pod \"d86151f8-f0a4-4b49-ad2d-a33db299831a\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.192924 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb\") pod \"d86151f8-f0a4-4b49-ad2d-a33db299831a\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.192968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm2z2\" (UniqueName: \"kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2\") pod \"d86151f8-f0a4-4b49-ad2d-a33db299831a\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.193021 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc\") pod \"d86151f8-f0a4-4b49-ad2d-a33db299831a\" (UID: \"d86151f8-f0a4-4b49-ad2d-a33db299831a\") " Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.198971 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2" (OuterVolumeSpecName: "kube-api-access-nm2z2") pod "d86151f8-f0a4-4b49-ad2d-a33db299831a" (UID: "d86151f8-f0a4-4b49-ad2d-a33db299831a"). InnerVolumeSpecName "kube-api-access-nm2z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.232187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config" (OuterVolumeSpecName: "config") pod "d86151f8-f0a4-4b49-ad2d-a33db299831a" (UID: "d86151f8-f0a4-4b49-ad2d-a33db299831a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.239655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d86151f8-f0a4-4b49-ad2d-a33db299831a" (UID: "d86151f8-f0a4-4b49-ad2d-a33db299831a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.241791 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d86151f8-f0a4-4b49-ad2d-a33db299831a" (UID: "d86151f8-f0a4-4b49-ad2d-a33db299831a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.255230 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d86151f8-f0a4-4b49-ad2d-a33db299831a" (UID: "d86151f8-f0a4-4b49-ad2d-a33db299831a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.294423 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.294458 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.294469 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.294479 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d86151f8-f0a4-4b49-ad2d-a33db299831a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.294489 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm2z2\" (UniqueName: \"kubernetes.io/projected/d86151f8-f0a4-4b49-ad2d-a33db299831a-kube-api-access-nm2z2\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.951833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" event={"ID":"d86151f8-f0a4-4b49-ad2d-a33db299831a","Type":"ContainerDied","Data":"545f42b2ba67dfb9b6a1d1727c169212796a332b3d3da9b73617d3312e151d40"} Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.951878 4763 scope.go:117] "RemoveContainer" containerID="c3cb9211955b463705a862764ae1f834f11bcbdb04c993625b8114b6ecb4d340" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.951908 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-9skjc" Oct 06 16:22:10 crc kubenswrapper[4763]: I1006 16:22:10.986510 4763 scope.go:117] "RemoveContainer" containerID="1a668babf2a6aab7be3658fd6e681a2ac42e79c807e4d91b9ffcaf3fe0049bc4" Oct 06 16:22:11 crc kubenswrapper[4763]: I1006 16:22:11.014545 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:22:11 crc kubenswrapper[4763]: I1006 16:22:11.020539 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-9skjc"] Oct 06 16:22:11 crc kubenswrapper[4763]: I1006 16:22:11.575841 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:22:11 crc kubenswrapper[4763]: E1006 16:22:11.576843 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:22:11 crc kubenswrapper[4763]: I1006 16:22:11.586366 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" path="/var/lib/kubelet/pods/d86151f8-f0a4-4b49-ad2d-a33db299831a/volumes" Oct 06 16:22:23 crc kubenswrapper[4763]: I1006 16:22:23.583241 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:22:23 crc kubenswrapper[4763]: E1006 16:22:23.584279 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:22:29 crc kubenswrapper[4763]: I1006 16:22:29.502048 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-765d7995c-g9879" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.285217 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lsp4w"] Oct 06 16:22:37 crc kubenswrapper[4763]: E1006 16:22:37.286189 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="init" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.286206 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="init" Oct 06 16:22:37 crc kubenswrapper[4763]: E1006 16:22:37.286220 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="dnsmasq-dns" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.286228 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="dnsmasq-dns" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.295019 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86151f8-f0a4-4b49-ad2d-a33db299831a" containerName="dnsmasq-dns" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.295845 4763 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.304438 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lsp4w"] Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.417685 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qln\" (UniqueName: \"kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln\") pod \"glance-db-create-lsp4w\" (UID: \"753794e2-1eed-49c0-81d0-b684b31e1986\") " pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.519669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qln\" (UniqueName: \"kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln\") pod \"glance-db-create-lsp4w\" (UID: \"753794e2-1eed-49c0-81d0-b684b31e1986\") " pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.545813 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qln\" (UniqueName: \"kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln\") pod \"glance-db-create-lsp4w\" (UID: \"753794e2-1eed-49c0-81d0-b684b31e1986\") " pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.576132 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:22:37 crc kubenswrapper[4763]: E1006 16:22:37.576554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:22:37 crc kubenswrapper[4763]: I1006 16:22:37.622738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:38 crc kubenswrapper[4763]: I1006 16:22:38.129322 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lsp4w"] Oct 06 16:22:38 crc kubenswrapper[4763]: I1006 16:22:38.214810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsp4w" event={"ID":"753794e2-1eed-49c0-81d0-b684b31e1986","Type":"ContainerStarted","Data":"532639efa2b36ee5bb4b901527c257a9bbe72379da409ed1ad67a9ca10a0a9d5"} Oct 06 16:22:39 crc kubenswrapper[4763]: I1006 16:22:39.231537 4763 generic.go:334] "Generic (PLEG): container finished" podID="753794e2-1eed-49c0-81d0-b684b31e1986" containerID="76f4c1ca7885bb77c9251f86fef173e79da477b24560f096aeb2a74b07aa4025" exitCode=0 Oct 06 16:22:39 crc kubenswrapper[4763]: I1006 16:22:39.231669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsp4w" event={"ID":"753794e2-1eed-49c0-81d0-b684b31e1986","Type":"ContainerDied","Data":"76f4c1ca7885bb77c9251f86fef173e79da477b24560f096aeb2a74b07aa4025"} Oct 06 16:22:40 crc kubenswrapper[4763]: I1006 16:22:40.525850 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:40 crc kubenswrapper[4763]: I1006 16:22:40.679092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76qln\" (UniqueName: \"kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln\") pod \"753794e2-1eed-49c0-81d0-b684b31e1986\" (UID: \"753794e2-1eed-49c0-81d0-b684b31e1986\") " Oct 06 16:22:40 crc kubenswrapper[4763]: I1006 16:22:40.686544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln" (OuterVolumeSpecName: "kube-api-access-76qln") pod "753794e2-1eed-49c0-81d0-b684b31e1986" (UID: "753794e2-1eed-49c0-81d0-b684b31e1986"). InnerVolumeSpecName "kube-api-access-76qln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:22:40 crc kubenswrapper[4763]: I1006 16:22:40.781462 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76qln\" (UniqueName: \"kubernetes.io/projected/753794e2-1eed-49c0-81d0-b684b31e1986-kube-api-access-76qln\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:41 crc kubenswrapper[4763]: I1006 16:22:41.249891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsp4w" event={"ID":"753794e2-1eed-49c0-81d0-b684b31e1986","Type":"ContainerDied","Data":"532639efa2b36ee5bb4b901527c257a9bbe72379da409ed1ad67a9ca10a0a9d5"} Oct 06 16:22:41 crc kubenswrapper[4763]: I1006 16:22:41.250510 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="532639efa2b36ee5bb4b901527c257a9bbe72379da409ed1ad67a9ca10a0a9d5" Oct 06 16:22:41 crc kubenswrapper[4763]: I1006 16:22:41.250016 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lsp4w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.399903 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5730-account-create-f2k7w"] Oct 06 16:22:47 crc kubenswrapper[4763]: E1006 16:22:47.400846 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753794e2-1eed-49c0-81d0-b684b31e1986" containerName="mariadb-database-create" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.400862 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="753794e2-1eed-49c0-81d0-b684b31e1986" containerName="mariadb-database-create" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.401089 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="753794e2-1eed-49c0-81d0-b684b31e1986" containerName="mariadb-database-create" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.401793 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.404167 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.410989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5730-account-create-f2k7w"] Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.503674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbnb\" (UniqueName: \"kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb\") pod \"glance-5730-account-create-f2k7w\" (UID: \"358ef27a-74d6-44d0-baf7-d8576b31fb47\") " pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.605744 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbnb\" (UniqueName: \"kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb\") pod \"glance-5730-account-create-f2k7w\" (UID: \"358ef27a-74d6-44d0-baf7-d8576b31fb47\") " pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.624650 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbnb\" (UniqueName: \"kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb\") pod \"glance-5730-account-create-f2k7w\" (UID: \"358ef27a-74d6-44d0-baf7-d8576b31fb47\") " pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.723261 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:47 crc kubenswrapper[4763]: I1006 16:22:47.973210 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5730-account-create-f2k7w"] Oct 06 16:22:48 crc kubenswrapper[4763]: I1006 16:22:48.316111 4763 generic.go:334] "Generic (PLEG): container finished" podID="358ef27a-74d6-44d0-baf7-d8576b31fb47" containerID="b4a652916d05bb44086cf712da3e1beeb22b83c95504b2a19efe93c4b3b29d03" exitCode=0 Oct 06 16:22:48 crc kubenswrapper[4763]: I1006 16:22:48.316233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5730-account-create-f2k7w" event={"ID":"358ef27a-74d6-44d0-baf7-d8576b31fb47","Type":"ContainerDied","Data":"b4a652916d05bb44086cf712da3e1beeb22b83c95504b2a19efe93c4b3b29d03"} Oct 06 16:22:48 crc kubenswrapper[4763]: I1006 16:22:48.316590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5730-account-create-f2k7w" event={"ID":"358ef27a-74d6-44d0-baf7-d8576b31fb47","Type":"ContainerStarted","Data":"38ccb800ea71a8965841ab5ceb82fa903d104f2eea6a3f293eb6524a6e28a46d"} Oct 06 16:22:49 crc kubenswrapper[4763]: I1006 16:22:49.602849 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:49 crc kubenswrapper[4763]: I1006 16:22:49.644714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcbnb\" (UniqueName: \"kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb\") pod \"358ef27a-74d6-44d0-baf7-d8576b31fb47\" (UID: \"358ef27a-74d6-44d0-baf7-d8576b31fb47\") " Oct 06 16:22:49 crc kubenswrapper[4763]: I1006 16:22:49.650272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb" (OuterVolumeSpecName: "kube-api-access-zcbnb") pod "358ef27a-74d6-44d0-baf7-d8576b31fb47" (UID: "358ef27a-74d6-44d0-baf7-d8576b31fb47"). InnerVolumeSpecName "kube-api-access-zcbnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:22:49 crc kubenswrapper[4763]: I1006 16:22:49.747403 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcbnb\" (UniqueName: \"kubernetes.io/projected/358ef27a-74d6-44d0-baf7-d8576b31fb47-kube-api-access-zcbnb\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:50 crc kubenswrapper[4763]: I1006 16:22:50.335863 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5730-account-create-f2k7w" event={"ID":"358ef27a-74d6-44d0-baf7-d8576b31fb47","Type":"ContainerDied","Data":"38ccb800ea71a8965841ab5ceb82fa903d104f2eea6a3f293eb6524a6e28a46d"} Oct 06 16:22:50 crc kubenswrapper[4763]: I1006 16:22:50.336205 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ccb800ea71a8965841ab5ceb82fa903d104f2eea6a3f293eb6524a6e28a46d" Oct 06 16:22:50 crc kubenswrapper[4763]: I1006 16:22:50.335910 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5730-account-create-f2k7w" Oct 06 16:22:50 crc kubenswrapper[4763]: I1006 16:22:50.575731 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:22:50 crc kubenswrapper[4763]: E1006 16:22:50.576243 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.619781 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qg9qj"] Oct 06 16:22:52 crc kubenswrapper[4763]: E1006 16:22:52.620304 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358ef27a-74d6-44d0-baf7-d8576b31fb47" containerName="mariadb-account-create" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.620315 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="358ef27a-74d6-44d0-baf7-d8576b31fb47" containerName="mariadb-account-create" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.620475 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="358ef27a-74d6-44d0-baf7-d8576b31fb47" containerName="mariadb-account-create" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.621000 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.623569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.629157 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-th44z" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.631752 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qg9qj"] Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.705563 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.705704 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.706138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnt7d\" (UniqueName: \"kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.706666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.808462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.808564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.808642 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.808731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnt7d\" (UniqueName: \"kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d\") pod 
\"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.820640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.820703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.827879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.845099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnt7d\" (UniqueName: \"kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d\") pod \"glance-db-sync-qg9qj\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:52 crc kubenswrapper[4763]: I1006 16:22:52.969776 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:53 crc kubenswrapper[4763]: I1006 16:22:53.491636 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qg9qj"] Oct 06 16:22:54 crc kubenswrapper[4763]: I1006 16:22:54.372113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qg9qj" event={"ID":"b29e5a26-c005-4cdf-9483-131828c01169","Type":"ContainerStarted","Data":"2ec638e498b7adc69ad84f3c79770b08ed5166ab27dd7371f6cd6270c43aff50"} Oct 06 16:22:54 crc kubenswrapper[4763]: I1006 16:22:54.372466 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qg9qj" event={"ID":"b29e5a26-c005-4cdf-9483-131828c01169","Type":"ContainerStarted","Data":"9bf272f4e35cf9efb293dc1ec1b8947bafb2dff7f5898101c0428de3a903e4cc"} Oct 06 16:22:54 crc kubenswrapper[4763]: I1006 16:22:54.388244 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qg9qj" podStartSLOduration=2.388220466 podStartE2EDuration="2.388220466s" podCreationTimestamp="2025-10-06 16:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:22:54.38500698 +0000 UTC m=+5371.540299492" watchObservedRunningTime="2025-10-06 16:22:54.388220466 +0000 UTC m=+5371.543512978" Oct 06 16:22:57 crc kubenswrapper[4763]: I1006 16:22:57.408155 4763 generic.go:334] "Generic (PLEG): container finished" podID="b29e5a26-c005-4cdf-9483-131828c01169" containerID="2ec638e498b7adc69ad84f3c79770b08ed5166ab27dd7371f6cd6270c43aff50" exitCode=0 Oct 06 16:22:57 crc kubenswrapper[4763]: I1006 16:22:57.408332 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qg9qj" 
event={"ID":"b29e5a26-c005-4cdf-9483-131828c01169","Type":"ContainerDied","Data":"2ec638e498b7adc69ad84f3c79770b08ed5166ab27dd7371f6cd6270c43aff50"} Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.900167 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.908678 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data\") pod \"b29e5a26-c005-4cdf-9483-131828c01169\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.908797 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnt7d\" (UniqueName: \"kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d\") pod \"b29e5a26-c005-4cdf-9483-131828c01169\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.908858 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle\") pod \"b29e5a26-c005-4cdf-9483-131828c01169\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.908984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data\") pod \"b29e5a26-c005-4cdf-9483-131828c01169\" (UID: \"b29e5a26-c005-4cdf-9483-131828c01169\") " Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.914933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d" (OuterVolumeSpecName: "kube-api-access-dnt7d") pod "b29e5a26-c005-4cdf-9483-131828c01169" (UID: "b29e5a26-c005-4cdf-9483-131828c01169"). InnerVolumeSpecName "kube-api-access-dnt7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.914979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b29e5a26-c005-4cdf-9483-131828c01169" (UID: "b29e5a26-c005-4cdf-9483-131828c01169"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.944081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b29e5a26-c005-4cdf-9483-131828c01169" (UID: "b29e5a26-c005-4cdf-9483-131828c01169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:22:58 crc kubenswrapper[4763]: I1006 16:22:58.966008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data" (OuterVolumeSpecName: "config-data") pod "b29e5a26-c005-4cdf-9483-131828c01169" (UID: "b29e5a26-c005-4cdf-9483-131828c01169"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.010553 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.010586 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.010598 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnt7d\" (UniqueName: \"kubernetes.io/projected/b29e5a26-c005-4cdf-9483-131828c01169-kube-api-access-dnt7d\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.010608 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29e5a26-c005-4cdf-9483-131828c01169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.433572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qg9qj" event={"ID":"b29e5a26-c005-4cdf-9483-131828c01169","Type":"ContainerDied","Data":"9bf272f4e35cf9efb293dc1ec1b8947bafb2dff7f5898101c0428de3a903e4cc"} Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.433681 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bf272f4e35cf9efb293dc1ec1b8947bafb2dff7f5898101c0428de3a903e4cc" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.433814 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qg9qj" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.785624 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:22:59 crc kubenswrapper[4763]: E1006 16:22:59.786350 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29e5a26-c005-4cdf-9483-131828c01169" containerName="glance-db-sync" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.786374 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29e5a26-c005-4cdf-9483-131828c01169" containerName="glance-db-sync" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.786597 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29e5a26-c005-4cdf-9483-131828c01169" containerName="glance-db-sync" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.787746 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.790159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.790166 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.790447 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.790569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-th44z" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.801641 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"] Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.803510 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.817025 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.838072 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"] Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.857591 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.859150 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.861951 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.885149 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925309 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lq6f\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925379 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj7ng\" (UniqueName: \"kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925416 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925485 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:22:59 crc kubenswrapper[4763]: I1006 16:22:59.925545 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.026965 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lq6f\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027169 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj7ng\" (UniqueName: \"kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rljmz\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027264 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027476 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.027577 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.028282 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.028600 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.029085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.029252 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.029403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.029487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.035140 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.035234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.035286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0" 
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.038894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.044187 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj7ng\" (UniqueName: \"kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng\") pod \"dnsmasq-dns-8565f7649c-scn54\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " pod="openstack/dnsmasq-dns-8565f7649c-scn54"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.044815 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lq6f\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f\") pod \"glance-default-external-api-0\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " pod="openstack/glance-default-external-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.107794 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.128924 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.128967 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.129011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.129055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.129089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rljmz\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.129170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.129202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.131160 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.132115 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-scn54"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.135462 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.135821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.136383 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.138301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.146436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.154865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rljmz\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz\") pod \"glance-default-internal-api-0\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.179555 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.683504 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.693820 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"]
Oct 06 16:23:00 crc kubenswrapper[4763]: I1006 16:23:00.818029 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.090266 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.452948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerStarted","Data":"12c47e8e64b48c8564962db23d2092e3fb8f451ef47d5de7a0ef99df6cad8ce6"}
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.455259 4763 generic.go:334] "Generic (PLEG): container finished" podID="aef41853-eb12-41bf-ae34-2f46a8538650" containerID="fecde02cc50d8ffcfc53a83d9581e3d5547a56583c6b35c011f6eda99d1ca6ac" exitCode=0
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.455523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-scn54" event={"ID":"aef41853-eb12-41bf-ae34-2f46a8538650","Type":"ContainerDied","Data":"fecde02cc50d8ffcfc53a83d9581e3d5547a56583c6b35c011f6eda99d1ca6ac"}
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.455571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-scn54" event={"ID":"aef41853-eb12-41bf-ae34-2f46a8538650","Type":"ContainerStarted","Data":"887421a66f7f96e1cdd789672c74b90dd22143488ccb871688a41768855a666d"}
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.457645 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerStarted","Data":"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"}
Oct 06 16:23:01 crc kubenswrapper[4763]: I1006 16:23:01.457674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerStarted","Data":"a4dc6e4b33416cd9f9a1d7079fe5700b8b087f7ffa150c71de0826293d412266"}
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.471401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerStarted","Data":"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"}
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.472129 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerStarted","Data":"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"}
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.474964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-scn54" event={"ID":"aef41853-eb12-41bf-ae34-2f46a8538650","Type":"ContainerStarted","Data":"da8961f38f0f7573e8ab3af0f4b8877edeedd4fb20bc0b159b1313280a2bdd79"}
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.475076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8565f7649c-scn54"
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.476977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerStarted","Data":"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"}
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.477150 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-log" containerID="cri-o://44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e" gracePeriod=30
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.477191 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-httpd" containerID="cri-o://5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c" gracePeriod=30
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.502102 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.502079107 podStartE2EDuration="3.502079107s" podCreationTimestamp="2025-10-06 16:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:02.495353956 +0000 UTC m=+5379.650646498" watchObservedRunningTime="2025-10-06 16:23:02.502079107 +0000 UTC m=+5379.657371619"
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.526752 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8565f7649c-scn54" podStartSLOduration=3.5267303070000002 podStartE2EDuration="3.526730307s" podCreationTimestamp="2025-10-06 16:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:02.514498729 +0000 UTC m=+5379.669791251" watchObservedRunningTime="2025-10-06 16:23:02.526730307 +0000 UTC m=+5379.682022829"
Oct 06 16:23:02 crc kubenswrapper[4763]: I1006 16:23:02.536525 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.536507089 podStartE2EDuration="3.536507089s" podCreationTimestamp="2025-10-06 16:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:02.535858862 +0000 UTC m=+5379.691151374" watchObservedRunningTime="2025-10-06 16:23:02.536507089 +0000 UTC m=+5379.691799601"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.221232 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
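The pod_startup_latency_tracker entries report podStartSLOduration, which in each case lines up exactly with watchObservedRunningTime minus podCreationTimestamp (the two pulling timestamps are the zero time because no image pull was needed). A small stdlib check of that arithmetic, assuming these fields use Go's default time.Time formatting as they appear to:

    package main

    import (
    	"fmt"
    	"time"
    )

    // Go's default time.Time print layout, which the kubelet fields above match.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func main() {
    	created, err1 := time.Parse(layout, "2025-10-06 16:22:59 +0000 UTC")
    	watched, err2 := time.Parse(layout, "2025-10-06 16:23:02.502079107 +0000 UTC")
    	if err1 != nil || err2 != nil {
    		panic("unexpected timestamp format")
    	}
    	// Matches podStartSLOduration=3.502079107 for glance-default-internal-api-0.
    	fmt.Println(watched.Sub(created)) // 3.502079107s
    }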
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.296541 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390499 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390565 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390599 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390644 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lq6f\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.390716 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data\") pod \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\" (UID: \"d36c26e8-4ed8-4fa5-85ba-ac923385e411\") " Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.391176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.392072 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs" (OuterVolumeSpecName: "logs") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.395989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f" (OuterVolumeSpecName: "kube-api-access-8lq6f") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "kube-api-access-8lq6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.396259 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts" (OuterVolumeSpecName: "scripts") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.400725 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph" (OuterVolumeSpecName: "ceph") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.419465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.452870 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data" (OuterVolumeSpecName: "config-data") pod "d36c26e8-4ed8-4fa5-85ba-ac923385e411" (UID: "d36c26e8-4ed8-4fa5-85ba-ac923385e411"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.485835 4763 generic.go:334] "Generic (PLEG): container finished" podID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerID="5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c" exitCode=0 Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.486489 4763 generic.go:334] "Generic (PLEG): container finished" podID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerID="44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e" exitCode=143 Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.485918 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerDied","Data":"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"} Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.486669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerDied","Data":"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"} Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.486703 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d36c26e8-4ed8-4fa5-85ba-ac923385e411","Type":"ContainerDied","Data":"a4dc6e4b33416cd9f9a1d7079fe5700b8b087f7ffa150c71de0826293d412266"} Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.486701 4763 scope.go:117] "RemoveContainer" containerID="5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.485887 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.492659 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.492684 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.492843 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.493168 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.493194 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36c26e8-4ed8-4fa5-85ba-ac923385e411-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.493207 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d36c26e8-4ed8-4fa5-85ba-ac923385e411-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.493244 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lq6f\" (UniqueName: \"kubernetes.io/projected/d36c26e8-4ed8-4fa5-85ba-ac923385e411-kube-api-access-8lq6f\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.533129 4763 scope.go:117] "RemoveContainer" containerID="44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.536501 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.551559 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.566231 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:23:03 crc kubenswrapper[4763]: E1006 16:23:03.566712 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-httpd" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.566735 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-httpd" Oct 06 16:23:03 crc kubenswrapper[4763]: E1006 16:23:03.566769 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-log" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.566778 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-log" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.566968 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-httpd" Oct 06 16:23:03 
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.567000 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" containerName="glance-log"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.567885 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.572636 4763 scope.go:117] "RemoveContainer" containerID="5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.572895 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 06 16:23:03 crc kubenswrapper[4763]: E1006 16:23:03.573442 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c\": container with ID starting with 5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c not found: ID does not exist" containerID="5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.573480 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"} err="failed to get container status \"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c\": rpc error: code = NotFound desc = could not find container \"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c\": container with ID starting with 5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c not found: ID does not exist"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.573511 4763 scope.go:117] "RemoveContainer" containerID="44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"
Oct 06 16:23:03 crc kubenswrapper[4763]: E1006 16:23:03.573987 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e\": container with ID starting with 44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e not found: ID does not exist" containerID="44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.574101 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"} err="failed to get container status \"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e\": rpc error: code = NotFound desc = could not find container \"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e\": container with ID starting with 44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e not found: ID does not exist"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.574173 4763 scope.go:117] "RemoveContainer" containerID="5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.574895 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b"
Oct 06 16:23:03 crc kubenswrapper[4763]: E1006 16:23:03.575173 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.575488 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c"} err="failed to get container status \"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c\": rpc error: code = NotFound desc = could not find container \"5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c\": container with ID starting with 5890fffc4ecc1850e619ac36dcd26305e06360b99efdf9b555be1ccf94eebc0c not found: ID does not exist"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.577399 4763 scope.go:117] "RemoveContainer" containerID="44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.581476 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e"} err="failed to get container status \"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e\": rpc error: code = NotFound desc = could not find container \"44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e\": container with ID starting with 44e044bd81d1438701a3a7f63b48e028d73ef11a9d3c2fbf913c9685a25a346e not found: ID does not exist"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0"
Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594506 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0"
\"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.594941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.596298 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36c26e8-4ed8-4fa5-85ba-ac923385e411" path="/var/lib/kubelet/pods/d36c26e8-4ed8-4fa5-85ba-ac923385e411/volumes" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.597131 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.695967 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696088 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696140 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.696214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56s2b\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.697203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.697341 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.701088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.702356 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.702451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.702952 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.715227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56s2b\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b\") pod \"glance-default-external-api-0\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " pod="openstack/glance-default-external-api-0" Oct 06 16:23:03 crc kubenswrapper[4763]: I1006 16:23:03.886221 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:23:04 crc kubenswrapper[4763]: I1006 16:23:04.427424 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:23:04 crc kubenswrapper[4763]: I1006 16:23:04.497604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerStarted","Data":"bcf60ac450010b7501caed4d526a1db7cbb21645f55256cc4522594583a8722e"} Oct 06 16:23:04 crc kubenswrapper[4763]: I1006 16:23:04.499468 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-log" containerID="cri-o://ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a" gracePeriod=30 Oct 06 16:23:04 crc kubenswrapper[4763]: I1006 16:23:04.499531 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-httpd" containerID="cri-o://645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c" gracePeriod=30 Oct 06 16:23:04 crc kubenswrapper[4763]: I1006 16:23:04.984754 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124425 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124511 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rljmz\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124651 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124687 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") " Oct 06 
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.124752 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts\") pod \"8921620b-3b63-482b-b1c3-8c5def8928fb\" (UID: \"8921620b-3b63-482b-b1c3-8c5def8928fb\") "
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.125475 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs" (OuterVolumeSpecName: "logs") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.125499 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.128814 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph" (OuterVolumeSpecName: "ceph") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.129414 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz" (OuterVolumeSpecName: "kube-api-access-rljmz") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "kube-api-access-rljmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.136796 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts" (OuterVolumeSpecName: "scripts") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.155742 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.190868 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data" (OuterVolumeSpecName: "config-data") pod "8921620b-3b63-482b-b1c3-8c5def8928fb" (UID: "8921620b-3b63-482b-b1c3-8c5def8928fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226704 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-logs\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226738 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8921620b-3b63-482b-b1c3-8c5def8928fb-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226749 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rljmz\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-kube-api-access-rljmz\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226759 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226767 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8921620b-3b63-482b-b1c3-8c5def8928fb-ceph\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226776 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.226785 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8921620b-3b63-482b-b1c3-8c5def8928fb-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.510121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerStarted","Data":"406287c65797cd2ff0595db4b85e18a58994679e45b67a790eaa03a26cc43710"}
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.510444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerStarted","Data":"bd0a0b7bdfe3d5fe79478b17b35b7be0828ceec6f7ae720249005a4222a01523"}
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513085 4763 generic.go:334] "Generic (PLEG): container finished" podID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerID="645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c" exitCode=0
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513122 4763 generic.go:334] "Generic (PLEG): container finished" podID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerID="ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a" exitCode=143
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513140 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerDied","Data":"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"}
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerDied","Data":"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"}
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8921620b-3b63-482b-b1c3-8c5def8928fb","Type":"ContainerDied","Data":"12c47e8e64b48c8564962db23d2092e3fb8f451ef47d5de7a0ef99df6cad8ce6"}
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.513214 4763 scope.go:117] "RemoveContainer" containerID="645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.538536 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.538518521 podStartE2EDuration="2.538518521s" podCreationTimestamp="2025-10-06 16:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:05.531852532 +0000 UTC m=+5382.687145054" watchObservedRunningTime="2025-10-06 16:23:05.538518521 +0000 UTC m=+5382.693811033"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.548963 4763 scope.go:117] "RemoveContainer" containerID="ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.559163 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.567519 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.580071 4763 scope.go:117] "RemoveContainer" containerID="645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"
Oct 06 16:23:05 crc kubenswrapper[4763]: E1006 16:23:05.582088 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c\": container with ID starting with 645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c not found: ID does not exist" containerID="645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.582119 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"} err="failed to get container status \"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c\": rpc error: code = NotFound desc = could not find container \"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c\": container with ID starting with 645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c not found: ID does not exist"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.582143 4763 scope.go:117] "RemoveContainer" containerID="ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"
Oct 06 16:23:05 crc kubenswrapper[4763]: E1006 16:23:05.590207 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a\": container with ID starting with ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a not found: ID does not exist" containerID="ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.590257 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"} err="failed to get container status \"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a\": rpc error: code = NotFound desc = could not find container \"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a\": container with ID starting with ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a not found: ID does not exist"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.590288 4763 scope.go:117] "RemoveContainer" containerID="645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.591651 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c"} err="failed to get container status \"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c\": rpc error: code = NotFound desc = could not find container \"645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c\": container with ID starting with 645eac04c1d9dc896c905eb7c3ce43c570300251c6a136f0243ca46f9bf4403c not found: ID does not exist"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.591710 4763 scope.go:117] "RemoveContainer" containerID="ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.591999 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a"} err="failed to get container status \"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a\": rpc error: code = NotFound desc = could not find container \"ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a\": container with ID starting with ed8d35dff2a4ba0a55f4b4b1b53470da453d2673ed183710fc27ec3f1f13a15a not found: ID does not exist"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.606771 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" path="/var/lib/kubelet/pods/8921620b-3b63-482b-b1c3-8c5def8928fb/volumes"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.607448 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 16:23:05 crc kubenswrapper[4763]: E1006 16:23:05.607802 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-httpd"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.607821 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-httpd"
Oct 06 16:23:05 crc kubenswrapper[4763]: E1006 16:23:05.607845 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-log"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.607852 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-log"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.608002 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-log"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.608029 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8921620b-3b63-482b-b1c3-8c5def8928fb" containerName="glance-httpd"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.609450 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.609546 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.615222 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633724 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8q7\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0"
Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0"
\"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.633847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.734895 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8q7\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.734967 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.734987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.735825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.739451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.740644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.741834 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.743727 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.752929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8q7\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7\") pod \"glance-default-internal-api-0\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:23:05 crc kubenswrapper[4763]: I1006 16:23:05.937260 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:06 crc kubenswrapper[4763]: I1006 16:23:06.456509 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:23:06 crc kubenswrapper[4763]: I1006 16:23:06.521995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerStarted","Data":"f0cd1ba86a26e7b2219f111af653c3243d3bc454ec032020821d0a34dcc556bb"} Oct 06 16:23:07 crc kubenswrapper[4763]: I1006 16:23:07.533798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerStarted","Data":"170a8f5ca5e2bdeb9ac21f65a289f63bf5745d0c14eb40adebf600d5d8afcca3"} Oct 06 16:23:07 crc kubenswrapper[4763]: I1006 16:23:07.534410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerStarted","Data":"fb17470f297a8aaaa2af829c7843e1eff33b230a533c5629817531f801d5eb13"} Oct 06 16:23:07 crc kubenswrapper[4763]: I1006 16:23:07.558523 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.558504562 podStartE2EDuration="2.558504562s" podCreationTimestamp="2025-10-06 16:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:07.550688393 +0000 UTC m=+5384.705980915" watchObservedRunningTime="2025-10-06 16:23:07.558504562 +0000 UTC m=+5384.713797074" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.135863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.213828 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.214260 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="dnsmasq-dns" containerID="cri-o://f51f9d060969c20bd21e3415bd1dae553b3c25613b8d2b82a29e2ec8e552372b" gracePeriod=10 Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.567781 4763 generic.go:334] "Generic (PLEG): container finished" podID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerID="f51f9d060969c20bd21e3415bd1dae553b3c25613b8d2b82a29e2ec8e552372b" exitCode=0 Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.568061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" event={"ID":"6e4bac65-de72-45e5-8dfe-495462ccc48c","Type":"ContainerDied","Data":"f51f9d060969c20bd21e3415bd1dae553b3c25613b8d2b82a29e2ec8e552372b"} Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.683524 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.721627 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc\") pod \"6e4bac65-de72-45e5-8dfe-495462ccc48c\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.721708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config\") pod \"6e4bac65-de72-45e5-8dfe-495462ccc48c\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.721765 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb\") pod \"6e4bac65-de72-45e5-8dfe-495462ccc48c\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.721823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb\") pod \"6e4bac65-de72-45e5-8dfe-495462ccc48c\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.721955 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbcft\" (UniqueName: \"kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft\") pod \"6e4bac65-de72-45e5-8dfe-495462ccc48c\" (UID: \"6e4bac65-de72-45e5-8dfe-495462ccc48c\") " Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.729059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft" (OuterVolumeSpecName: "kube-api-access-xbcft") pod "6e4bac65-de72-45e5-8dfe-495462ccc48c" (UID: "6e4bac65-de72-45e5-8dfe-495462ccc48c"). InnerVolumeSpecName "kube-api-access-xbcft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.765343 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config" (OuterVolumeSpecName: "config") pod "6e4bac65-de72-45e5-8dfe-495462ccc48c" (UID: "6e4bac65-de72-45e5-8dfe-495462ccc48c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.772290 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e4bac65-de72-45e5-8dfe-495462ccc48c" (UID: "6e4bac65-de72-45e5-8dfe-495462ccc48c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.774037 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e4bac65-de72-45e5-8dfe-495462ccc48c" (UID: "6e4bac65-de72-45e5-8dfe-495462ccc48c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.775132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e4bac65-de72-45e5-8dfe-495462ccc48c" (UID: "6e4bac65-de72-45e5-8dfe-495462ccc48c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.823153 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbcft\" (UniqueName: \"kubernetes.io/projected/6e4bac65-de72-45e5-8dfe-495462ccc48c-kube-api-access-xbcft\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.823189 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.823200 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.823210 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:10 crc kubenswrapper[4763]: I1006 16:23:10.823218 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e4bac65-de72-45e5-8dfe-495462ccc48c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.602121 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.621578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-6727t" event={"ID":"6e4bac65-de72-45e5-8dfe-495462ccc48c","Type":"ContainerDied","Data":"db1a79e3d398848d31708fcaa5272b6bae42bc2b6f3788a081dc8253e3ced807"} Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.621764 4763 scope.go:117] "RemoveContainer" containerID="f51f9d060969c20bd21e3415bd1dae553b3c25613b8d2b82a29e2ec8e552372b" Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.668552 4763 scope.go:117] "RemoveContainer" containerID="2703d244d3da143b8b64b50ede21d1e00a3099322abdc7a409d8dc9a315a375a" Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.671121 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:23:11 crc kubenswrapper[4763]: I1006 16:23:11.682804 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-6727t"] Oct 06 16:23:13 crc kubenswrapper[4763]: I1006 16:23:13.591845 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" path="/var/lib/kubelet/pods/6e4bac65-de72-45e5-8dfe-495462ccc48c/volumes" Oct 06 16:23:13 crc kubenswrapper[4763]: I1006 16:23:13.887355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 16:23:13 crc kubenswrapper[4763]: I1006 16:23:13.887695 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 16:23:13 crc kubenswrapper[4763]: I1006 16:23:13.938628 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 16:23:13 crc kubenswrapper[4763]: I1006 16:23:13.962340 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 16:23:14 crc kubenswrapper[4763]: I1006 16:23:14.639523 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 16:23:14 crc kubenswrapper[4763]: I1006 16:23:14.639583 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 16:23:15 crc kubenswrapper[4763]: I1006 16:23:15.938166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:15 crc kubenswrapper[4763]: I1006 16:23:15.939414 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:15 crc kubenswrapper[4763]: I1006 16:23:15.980871 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:16 crc kubenswrapper[4763]: I1006 16:23:16.023847 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:16 crc kubenswrapper[4763]: I1006 16:23:16.577387 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 16:23:16 crc kubenswrapper[4763]: I1006 16:23:16.644593 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 16:23:16 crc kubenswrapper[4763]: I1006 
16:23:16.666153 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:16 crc kubenswrapper[4763]: I1006 16:23:16.666439 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:17 crc kubenswrapper[4763]: I1006 16:23:17.582016 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:23:17 crc kubenswrapper[4763]: E1006 16:23:17.582246 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:23:18 crc kubenswrapper[4763]: I1006 16:23:18.537400 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:18 crc kubenswrapper[4763]: I1006 16:23:18.681856 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.086381 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5jv9k"] Oct 06 16:23:27 crc kubenswrapper[4763]: E1006 16:23:27.087460 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="dnsmasq-dns" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.087496 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="dnsmasq-dns" Oct 06 16:23:27 crc kubenswrapper[4763]: E1006 16:23:27.087564 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="init" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.087581 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="init" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.088046 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4bac65-de72-45e5-8dfe-495462ccc48c" containerName="dnsmasq-dns" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.089017 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.104996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5jv9k"] Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.248184 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtkj\" (UniqueName: \"kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj\") pod \"placement-db-create-5jv9k\" (UID: \"f8429c81-df56-4dd6-a297-85f6ec56a2ed\") " pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.349743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqtkj\" (UniqueName: \"kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj\") pod \"placement-db-create-5jv9k\" (UID: \"f8429c81-df56-4dd6-a297-85f6ec56a2ed\") " pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.369889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtkj\" (UniqueName: \"kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj\") pod \"placement-db-create-5jv9k\" (UID: \"f8429c81-df56-4dd6-a297-85f6ec56a2ed\") " pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.423995 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:27 crc kubenswrapper[4763]: I1006 16:23:27.855155 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5jv9k"] Oct 06 16:23:27 crc kubenswrapper[4763]: W1006 16:23:27.857282 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8429c81_df56_4dd6_a297_85f6ec56a2ed.slice/crio-2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746 WatchSource:0}: Error finding container 2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746: Status 404 returned error can't find the container with id 2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746 Oct 06 16:23:28 crc kubenswrapper[4763]: I1006 16:23:28.810032 4763 generic.go:334] "Generic (PLEG): container finished" podID="f8429c81-df56-4dd6-a297-85f6ec56a2ed" containerID="9b7a16253acbc46f0bad8196bab9cb8208fb7a6a90c91cdb5b6489255f739db5" exitCode=0 Oct 06 16:23:28 crc kubenswrapper[4763]: I1006 16:23:28.810118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jv9k" event={"ID":"f8429c81-df56-4dd6-a297-85f6ec56a2ed","Type":"ContainerDied","Data":"9b7a16253acbc46f0bad8196bab9cb8208fb7a6a90c91cdb5b6489255f739db5"} Oct 06 16:23:28 crc kubenswrapper[4763]: I1006 16:23:28.810185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jv9k" event={"ID":"f8429c81-df56-4dd6-a297-85f6ec56a2ed","Type":"ContainerStarted","Data":"2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746"} Oct 06 16:23:29 crc kubenswrapper[4763]: I1006 16:23:29.575386 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:23:29 crc kubenswrapper[4763]: E1006 16:23:29.576113 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:23:29 crc kubenswrapper[4763]: I1006 16:23:29.649173 4763 scope.go:117] "RemoveContainer" containerID="f1f4d4b3e1cef47791ec883138f40cf366b55f78afd534d24b615083046aca47" Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.164773 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.313029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqtkj\" (UniqueName: \"kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj\") pod \"f8429c81-df56-4dd6-a297-85f6ec56a2ed\" (UID: \"f8429c81-df56-4dd6-a297-85f6ec56a2ed\") " Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.318808 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj" (OuterVolumeSpecName: "kube-api-access-dqtkj") pod "f8429c81-df56-4dd6-a297-85f6ec56a2ed" (UID: "f8429c81-df56-4dd6-a297-85f6ec56a2ed"). InnerVolumeSpecName "kube-api-access-dqtkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.416326 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqtkj\" (UniqueName: \"kubernetes.io/projected/f8429c81-df56-4dd6-a297-85f6ec56a2ed-kube-api-access-dqtkj\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.834304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jv9k" event={"ID":"f8429c81-df56-4dd6-a297-85f6ec56a2ed","Type":"ContainerDied","Data":"2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746"} Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.834379 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e417a4802b0f6f9e4f192169b80462856fa05305ae4faabd133740b3542b746" Oct 06 16:23:30 crc kubenswrapper[4763]: I1006 16:23:30.834393 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5jv9k" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.225300 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8826-account-create-kq9kx"] Oct 06 16:23:37 crc kubenswrapper[4763]: E1006 16:23:37.226917 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8429c81-df56-4dd6-a297-85f6ec56a2ed" containerName="mariadb-database-create" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.226997 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8429c81-df56-4dd6-a297-85f6ec56a2ed" containerName="mariadb-database-create" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.227231 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8429c81-df56-4dd6-a297-85f6ec56a2ed" containerName="mariadb-database-create" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.227874 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.230379 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.250596 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8826-account-create-kq9kx"] Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.264361 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvqbk\" (UniqueName: \"kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk\") pod \"placement-8826-account-create-kq9kx\" (UID: \"98cd77c1-a82d-4262-a70d-9bb52cac0667\") " pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.365661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvqbk\" (UniqueName: \"kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk\") pod \"placement-8826-account-create-kq9kx\" (UID: \"98cd77c1-a82d-4262-a70d-9bb52cac0667\") " pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.383006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvqbk\" (UniqueName: \"kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk\") pod \"placement-8826-account-create-kq9kx\" (UID: \"98cd77c1-a82d-4262-a70d-9bb52cac0667\") " pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:37 crc kubenswrapper[4763]: I1006 16:23:37.560255 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:38 crc kubenswrapper[4763]: I1006 16:23:38.014744 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8826-account-create-kq9kx"] Oct 06 16:23:38 crc kubenswrapper[4763]: I1006 16:23:38.944019 4763 generic.go:334] "Generic (PLEG): container finished" podID="98cd77c1-a82d-4262-a70d-9bb52cac0667" containerID="e4e808278d1d9bd74d61ccbced436c2a8868e7f6534c6ddd0253f1f07c6861aa" exitCode=0 Oct 06 16:23:38 crc kubenswrapper[4763]: I1006 16:23:38.944372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8826-account-create-kq9kx" event={"ID":"98cd77c1-a82d-4262-a70d-9bb52cac0667","Type":"ContainerDied","Data":"e4e808278d1d9bd74d61ccbced436c2a8868e7f6534c6ddd0253f1f07c6861aa"} Oct 06 16:23:38 crc kubenswrapper[4763]: I1006 16:23:38.944431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8826-account-create-kq9kx" event={"ID":"98cd77c1-a82d-4262-a70d-9bb52cac0667","Type":"ContainerStarted","Data":"f53c0061b082815acd43a83188b12f65aec0779c8b8102ea02edf11e5d8e5f4a"} Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.288918 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.323455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvqbk\" (UniqueName: \"kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk\") pod \"98cd77c1-a82d-4262-a70d-9bb52cac0667\" (UID: \"98cd77c1-a82d-4262-a70d-9bb52cac0667\") " Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.334291 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk" (OuterVolumeSpecName: "kube-api-access-jvqbk") pod "98cd77c1-a82d-4262-a70d-9bb52cac0667" (UID: "98cd77c1-a82d-4262-a70d-9bb52cac0667"). InnerVolumeSpecName "kube-api-access-jvqbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.424982 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvqbk\" (UniqueName: \"kubernetes.io/projected/98cd77c1-a82d-4262-a70d-9bb52cac0667-kube-api-access-jvqbk\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.961357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8826-account-create-kq9kx" event={"ID":"98cd77c1-a82d-4262-a70d-9bb52cac0667","Type":"ContainerDied","Data":"f53c0061b082815acd43a83188b12f65aec0779c8b8102ea02edf11e5d8e5f4a"} Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.961411 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53c0061b082815acd43a83188b12f65aec0779c8b8102ea02edf11e5d8e5f4a" Oct 06 16:23:40 crc kubenswrapper[4763]: I1006 16:23:40.961429 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8826-account-create-kq9kx" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.426256 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:23:42 crc kubenswrapper[4763]: E1006 16:23:42.427094 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cd77c1-a82d-4262-a70d-9bb52cac0667" containerName="mariadb-account-create" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.427117 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cd77c1-a82d-4262-a70d-9bb52cac0667" containerName="mariadb-account-create" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.427280 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cd77c1-a82d-4262-a70d-9bb52cac0667" containerName="mariadb-account-create" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.434177 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.443826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.459526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxll\" (UniqueName: \"kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.459637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.459663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.459694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.459747 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.485027 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v2jhg"] Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.486397 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.488417 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9cf7b" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.488733 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.489791 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.493564 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v2jhg"] Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561062 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59554\" (UniqueName: \"kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561150 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561213 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561251 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxll\" (UniqueName: \"kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561377 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.561433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.562304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.563576 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.564174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.564254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.575650 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:23:42 crc kubenswrapper[4763]: E1006 16:23:42.575956 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.583373 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxll\" (UniqueName: \"kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll\") pod \"dnsmasq-dns-85c649d7bf-7cf27\" (UID: 
\"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.663078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.664523 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.664865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.665035 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.665091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.665123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59554\" (UniqueName: \"kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.668481 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.672256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.672429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.680644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-59554\" (UniqueName: \"kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554\") pod \"placement-db-sync-v2jhg\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.756776 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:42 crc kubenswrapper[4763]: I1006 16:23:42.848330 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.184346 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.333371 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v2jhg"] Oct 06 16:23:43 crc kubenswrapper[4763]: W1006 16:23:43.342439 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88bc1ed1_051f_4afb_9ae1_aba8065452e2.slice/crio-bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f WatchSource:0}: Error finding container bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f: Status 404 returned error can't find the container with id bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.993367 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v2jhg" event={"ID":"88bc1ed1-051f-4afb-9ae1-aba8065452e2","Type":"ContainerStarted","Data":"a7d15db469d172b6ba5d469e63d2504978cb1d0e40b57b9712c2a47b0b54b2bb"} Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.993716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v2jhg" event={"ID":"88bc1ed1-051f-4afb-9ae1-aba8065452e2","Type":"ContainerStarted","Data":"bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f"} Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.995114 4763 generic.go:334] "Generic (PLEG): container finished" podID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerID="e27e159f45dcae7b83ee1f56ba01d551188c3aa5b8153263501f760d0adca987" exitCode=0 Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.995164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" event={"ID":"158506cf-4e2c-442b-9695-2764c2d9e7e2","Type":"ContainerDied","Data":"e27e159f45dcae7b83ee1f56ba01d551188c3aa5b8153263501f760d0adca987"} Oct 06 16:23:43 crc kubenswrapper[4763]: I1006 16:23:43.995192 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" event={"ID":"158506cf-4e2c-442b-9695-2764c2d9e7e2","Type":"ContainerStarted","Data":"77da69b93cefc84e9b281c49f4d21984454cb1cf38702104f520c6269670ff99"} Oct 06 16:23:44 crc kubenswrapper[4763]: I1006 16:23:44.013584 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v2jhg" podStartSLOduration=2.013569468 podStartE2EDuration="2.013569468s" podCreationTimestamp="2025-10-06 16:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:44.007886006 +0000 UTC m=+5421.163178518" watchObservedRunningTime="2025-10-06 16:23:44.013569468 +0000 UTC m=+5421.168861980" Oct 06 
16:23:45 crc kubenswrapper[4763]: I1006 16:23:45.007280 4763 generic.go:334] "Generic (PLEG): container finished" podID="88bc1ed1-051f-4afb-9ae1-aba8065452e2" containerID="a7d15db469d172b6ba5d469e63d2504978cb1d0e40b57b9712c2a47b0b54b2bb" exitCode=0 Oct 06 16:23:45 crc kubenswrapper[4763]: I1006 16:23:45.007352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v2jhg" event={"ID":"88bc1ed1-051f-4afb-9ae1-aba8065452e2","Type":"ContainerDied","Data":"a7d15db469d172b6ba5d469e63d2504978cb1d0e40b57b9712c2a47b0b54b2bb"} Oct 06 16:23:45 crc kubenswrapper[4763]: I1006 16:23:45.010460 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" event={"ID":"158506cf-4e2c-442b-9695-2764c2d9e7e2","Type":"ContainerStarted","Data":"584f180fbc1c5e63814bc310c17247662ac6e5f06a577ce2b87c06cea316720c"} Oct 06 16:23:45 crc kubenswrapper[4763]: I1006 16:23:45.010582 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:45 crc kubenswrapper[4763]: I1006 16:23:45.043266 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" podStartSLOduration=3.043251956 podStartE2EDuration="3.043251956s" podCreationTimestamp="2025-10-06 16:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:45.040078951 +0000 UTC m=+5422.195371463" watchObservedRunningTime="2025-10-06 16:23:45.043251956 +0000 UTC m=+5422.198544468" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.361677 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.531849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts\") pod \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.531932 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs\") pod \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.531997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data\") pod \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.532043 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle\") pod \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.532190 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59554\" (UniqueName: \"kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554\") pod \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\" (UID: \"88bc1ed1-051f-4afb-9ae1-aba8065452e2\") " Oct 06 
16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.533231 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs" (OuterVolumeSpecName: "logs") pod "88bc1ed1-051f-4afb-9ae1-aba8065452e2" (UID: "88bc1ed1-051f-4afb-9ae1-aba8065452e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.537271 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88bc1ed1-051f-4afb-9ae1-aba8065452e2-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.549852 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554" (OuterVolumeSpecName: "kube-api-access-59554") pod "88bc1ed1-051f-4afb-9ae1-aba8065452e2" (UID: "88bc1ed1-051f-4afb-9ae1-aba8065452e2"). InnerVolumeSpecName "kube-api-access-59554". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.550000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts" (OuterVolumeSpecName: "scripts") pod "88bc1ed1-051f-4afb-9ae1-aba8065452e2" (UID: "88bc1ed1-051f-4afb-9ae1-aba8065452e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.558093 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88bc1ed1-051f-4afb-9ae1-aba8065452e2" (UID: "88bc1ed1-051f-4afb-9ae1-aba8065452e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.573334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data" (OuterVolumeSpecName: "config-data") pod "88bc1ed1-051f-4afb-9ae1-aba8065452e2" (UID: "88bc1ed1-051f-4afb-9ae1-aba8065452e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.638413 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59554\" (UniqueName: \"kubernetes.io/projected/88bc1ed1-051f-4afb-9ae1-aba8065452e2-kube-api-access-59554\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.638637 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.638650 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.638662 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bc1ed1-051f-4afb-9ae1-aba8065452e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.712919 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc4965fb-jjbpd"] Oct 06 16:23:46 crc kubenswrapper[4763]: E1006 16:23:46.713321 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bc1ed1-051f-4afb-9ae1-aba8065452e2" containerName="placement-db-sync" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.713338 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bc1ed1-051f-4afb-9ae1-aba8065452e2" containerName="placement-db-sync" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.713498 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bc1ed1-051f-4afb-9ae1-aba8065452e2" containerName="placement-db-sync" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.714450 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.733492 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc4965fb-jjbpd"] Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.841303 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-scripts\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.841401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-combined-ca-bundle\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.842250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5831c20-f18f-4e98-9001-80e29602b3a1-logs\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.842352 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-config-data\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.842594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j4q7\" (UniqueName: \"kubernetes.io/projected/c5831c20-f18f-4e98-9001-80e29602b3a1-kube-api-access-2j4q7\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.943990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j4q7\" (UniqueName: \"kubernetes.io/projected/c5831c20-f18f-4e98-9001-80e29602b3a1-kube-api-access-2j4q7\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.944071 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-scripts\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.944136 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-combined-ca-bundle\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.944179 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c5831c20-f18f-4e98-9001-80e29602b3a1-logs\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.944212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-config-data\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.945023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5831c20-f18f-4e98-9001-80e29602b3a1-logs\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.948544 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-combined-ca-bundle\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.948688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-config-data\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.956273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5831c20-f18f-4e98-9001-80e29602b3a1-scripts\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:46 crc kubenswrapper[4763]: I1006 16:23:46.962312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j4q7\" (UniqueName: \"kubernetes.io/projected/c5831c20-f18f-4e98-9001-80e29602b3a1-kube-api-access-2j4q7\") pod \"placement-fc4965fb-jjbpd\" (UID: \"c5831c20-f18f-4e98-9001-80e29602b3a1\") " pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:47 crc kubenswrapper[4763]: I1006 16:23:47.025302 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v2jhg" event={"ID":"88bc1ed1-051f-4afb-9ae1-aba8065452e2","Type":"ContainerDied","Data":"bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f"} Oct 06 16:23:47 crc kubenswrapper[4763]: I1006 16:23:47.025341 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfe0c1a7a1d7e2ba23040a2c04c4137402ea5979c28b6901d2a19a6b3291786f" Oct 06 16:23:47 crc kubenswrapper[4763]: I1006 16:23:47.025320 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v2jhg" Oct 06 16:23:47 crc kubenswrapper[4763]: I1006 16:23:47.034368 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:47 crc kubenswrapper[4763]: I1006 16:23:47.491836 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc4965fb-jjbpd"] Oct 06 16:23:48 crc kubenswrapper[4763]: I1006 16:23:48.042808 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc4965fb-jjbpd" event={"ID":"c5831c20-f18f-4e98-9001-80e29602b3a1","Type":"ContainerStarted","Data":"17bb0060c7f72f796a7116dec48e5bdd0f05802706034603b16b454a229252af"} Oct 06 16:23:48 crc kubenswrapper[4763]: I1006 16:23:48.043357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc4965fb-jjbpd" event={"ID":"c5831c20-f18f-4e98-9001-80e29602b3a1","Type":"ContainerStarted","Data":"ddc1fa27646a39aee35d83e9d8fd5f876bd5f1981e58963fb1a57e901b4e1adc"} Oct 06 16:23:48 crc kubenswrapper[4763]: I1006 16:23:48.045305 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:48 crc kubenswrapper[4763]: I1006 16:23:48.045422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc4965fb-jjbpd" event={"ID":"c5831c20-f18f-4e98-9001-80e29602b3a1","Type":"ContainerStarted","Data":"94ab266894e41a323760c33c5735a97f165d29b4e17abee5d93ea21991911b3b"} Oct 06 16:23:48 crc kubenswrapper[4763]: I1006 16:23:48.083537 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc4965fb-jjbpd" podStartSLOduration=2.083510173 podStartE2EDuration="2.083510173s" podCreationTimestamp="2025-10-06 16:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:23:48.073204287 +0000 UTC m=+5425.228496809" watchObservedRunningTime="2025-10-06 16:23:48.083510173 +0000 UTC m=+5425.238802685" Oct 06 16:23:49 crc kubenswrapper[4763]: I1006 16:23:49.050224 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:23:52 crc kubenswrapper[4763]: I1006 16:23:52.758812 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:23:52 crc kubenswrapper[4763]: I1006 16:23:52.833891 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"] Oct 06 16:23:52 crc kubenswrapper[4763]: I1006 16:23:52.834380 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8565f7649c-scn54" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="dnsmasq-dns" containerID="cri-o://da8961f38f0f7573e8ab3af0f4b8877edeedd4fb20bc0b159b1313280a2bdd79" gracePeriod=10 Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.103348 4763 generic.go:334] "Generic (PLEG): container finished" podID="aef41853-eb12-41bf-ae34-2f46a8538650" containerID="da8961f38f0f7573e8ab3af0f4b8877edeedd4fb20bc0b159b1313280a2bdd79" exitCode=0 Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.103435 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-scn54" event={"ID":"aef41853-eb12-41bf-ae34-2f46a8538650","Type":"ContainerDied","Data":"da8961f38f0f7573e8ab3af0f4b8877edeedd4fb20bc0b159b1313280a2bdd79"} Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.284473 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.473267 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc\") pod \"aef41853-eb12-41bf-ae34-2f46a8538650\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.473315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb\") pod \"aef41853-eb12-41bf-ae34-2f46a8538650\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.473379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config\") pod \"aef41853-eb12-41bf-ae34-2f46a8538650\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.473468 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj7ng\" (UniqueName: \"kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng\") pod \"aef41853-eb12-41bf-ae34-2f46a8538650\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.473498 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb\") pod \"aef41853-eb12-41bf-ae34-2f46a8538650\" (UID: \"aef41853-eb12-41bf-ae34-2f46a8538650\") " Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.483021 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng" (OuterVolumeSpecName: "kube-api-access-qj7ng") pod "aef41853-eb12-41bf-ae34-2f46a8538650" (UID: "aef41853-eb12-41bf-ae34-2f46a8538650"). InnerVolumeSpecName "kube-api-access-qj7ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.522178 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config" (OuterVolumeSpecName: "config") pod "aef41853-eb12-41bf-ae34-2f46a8538650" (UID: "aef41853-eb12-41bf-ae34-2f46a8538650"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.524342 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aef41853-eb12-41bf-ae34-2f46a8538650" (UID: "aef41853-eb12-41bf-ae34-2f46a8538650"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.525531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aef41853-eb12-41bf-ae34-2f46a8538650" (UID: "aef41853-eb12-41bf-ae34-2f46a8538650"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.530443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aef41853-eb12-41bf-ae34-2f46a8538650" (UID: "aef41853-eb12-41bf-ae34-2f46a8538650"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.578688 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.578723 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.578733 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.578744 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj7ng\" (UniqueName: \"kubernetes.io/projected/aef41853-eb12-41bf-ae34-2f46a8538650-kube-api-access-qj7ng\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:53 crc kubenswrapper[4763]: I1006 16:23:53.578752 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aef41853-eb12-41bf-ae34-2f46a8538650-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.114512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-scn54" event={"ID":"aef41853-eb12-41bf-ae34-2f46a8538650","Type":"ContainerDied","Data":"887421a66f7f96e1cdd789672c74b90dd22143488ccb871688a41768855a666d"} Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.114884 4763 scope.go:117] "RemoveContainer" containerID="da8961f38f0f7573e8ab3af0f4b8877edeedd4fb20bc0b159b1313280a2bdd79" Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.114685 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-scn54" Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.146491 4763 scope.go:117] "RemoveContainer" containerID="fecde02cc50d8ffcfc53a83d9581e3d5547a56583c6b35c011f6eda99d1ca6ac" Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.148520 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"] Oct 06 16:23:54 crc kubenswrapper[4763]: I1006 16:23:54.157552 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-scn54"] Oct 06 16:23:55 crc kubenswrapper[4763]: I1006 16:23:55.586993 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" path="/var/lib/kubelet/pods/aef41853-eb12-41bf-ae34-2f46a8538650/volumes" Oct 06 16:23:56 crc kubenswrapper[4763]: I1006 16:23:56.575021 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:23:56 crc kubenswrapper[4763]: E1006 16:23:56.575607 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:24:11 crc kubenswrapper[4763]: I1006 16:24:11.576559 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:24:11 crc kubenswrapper[4763]: E1006 16:24:11.577598 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:24:17 crc kubenswrapper[4763]: I1006 16:24:17.996792 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:24:18 crc kubenswrapper[4763]: I1006 16:24:18.007241 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc4965fb-jjbpd" Oct 06 16:24:25 crc kubenswrapper[4763]: I1006 16:24:25.575393 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:24:25 crc kubenswrapper[4763]: E1006 16:24:25.576128 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:24:29 crc kubenswrapper[4763]: I1006 16:24:29.754670 4763 scope.go:117] "RemoveContainer" containerID="24b9a114c51fa39b3cbd76bddf067565d33db89070eff3940cf846b080a10460" Oct 06 16:24:36 crc kubenswrapper[4763]: I1006 16:24:36.575441 4763 scope.go:117] "RemoveContainer" 
containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:24:36 crc kubenswrapper[4763]: E1006 16:24:36.576092 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.301053 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vmffx"] Oct 06 16:24:42 crc kubenswrapper[4763]: E1006 16:24:42.301902 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="init" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.301915 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="init" Oct 06 16:24:42 crc kubenswrapper[4763]: E1006 16:24:42.301950 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="dnsmasq-dns" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.301956 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="dnsmasq-dns" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.302158 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef41853-eb12-41bf-ae34-2f46a8538650" containerName="dnsmasq-dns" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.302761 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.324873 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vmffx"] Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.395810 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fz6gw"] Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.397113 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.405079 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fz6gw"] Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.444605 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbbv\" (UniqueName: \"kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv\") pod \"nova-api-db-create-vmffx\" (UID: \"774c046f-c1cb-440b-9e1e-3515d4d41738\") " pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.509461 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7f98c"] Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.510946 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.526199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7f98c"] Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.546156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtlx\" (UniqueName: \"kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx\") pod \"nova-cell0-db-create-fz6gw\" (UID: \"49b733de-b169-483e-b598-9767d7a7f76d\") " pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.546243 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbbv\" (UniqueName: \"kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv\") pod \"nova-api-db-create-vmffx\" (UID: \"774c046f-c1cb-440b-9e1e-3515d4d41738\") " pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.569107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbbv\" (UniqueName: \"kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv\") pod \"nova-api-db-create-vmffx\" (UID: \"774c046f-c1cb-440b-9e1e-3515d4d41738\") " pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.623050 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.647184 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sncjv\" (UniqueName: \"kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv\") pod \"nova-cell1-db-create-7f98c\" (UID: \"33bdf8ee-5273-48e3-9c11-bd14bb06998a\") " pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.647295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxtlx\" (UniqueName: \"kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx\") pod \"nova-cell0-db-create-fz6gw\" (UID: \"49b733de-b169-483e-b598-9767d7a7f76d\") " pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.664339 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtlx\" (UniqueName: \"kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx\") pod \"nova-cell0-db-create-fz6gw\" (UID: \"49b733de-b169-483e-b598-9767d7a7f76d\") " pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.714153 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.753856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sncjv\" (UniqueName: \"kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv\") pod \"nova-cell1-db-create-7f98c\" (UID: \"33bdf8ee-5273-48e3-9c11-bd14bb06998a\") " pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.817569 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sncjv\" (UniqueName: \"kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv\") pod \"nova-cell1-db-create-7f98c\" (UID: \"33bdf8ee-5273-48e3-9c11-bd14bb06998a\") " pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:42 crc kubenswrapper[4763]: I1006 16:24:42.828056 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.207905 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vmffx"] Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.328746 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fz6gw"] Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.337374 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7f98c"] Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.558391 4763 generic.go:334] "Generic (PLEG): container finished" podID="774c046f-c1cb-440b-9e1e-3515d4d41738" containerID="5e5bc6ecc7df58d82d9bf9e17179fb87d491bbcfd54767575ca65d1ea3ee58a7" exitCode=0 Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.558479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmffx" event={"ID":"774c046f-c1cb-440b-9e1e-3515d4d41738","Type":"ContainerDied","Data":"5e5bc6ecc7df58d82d9bf9e17179fb87d491bbcfd54767575ca65d1ea3ee58a7"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.558819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmffx" event={"ID":"774c046f-c1cb-440b-9e1e-3515d4d41738","Type":"ContainerStarted","Data":"0a7b2d29f22622a425556bd9d922fc3d21614e3cf6866ac248d06c250e57621c"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.561575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz6gw" event={"ID":"49b733de-b169-483e-b598-9767d7a7f76d","Type":"ContainerStarted","Data":"96274dc5d4ab3bf3fd5e3b2f28ed2071a6a2afc83207c30295342e22f1b19bc8"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.561634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz6gw" event={"ID":"49b733de-b169-483e-b598-9767d7a7f76d","Type":"ContainerStarted","Data":"3ab424ba2e1b8cbba6ad3f927f108e1d1b7e79062e2a68953c5c81ba3d4926b8"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.563311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7f98c" event={"ID":"33bdf8ee-5273-48e3-9c11-bd14bb06998a","Type":"ContainerStarted","Data":"f1f419461efdedbd202967e64c87778a6aa491c1ccc56a07ba0f8336180b15a5"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.563343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7f98c" 
event={"ID":"33bdf8ee-5273-48e3-9c11-bd14bb06998a","Type":"ContainerStarted","Data":"e1e0053a7d2e43ce76624873427cee9a5b7d18518c23a93aea38e957443c0bf8"} Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.597584 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-fz6gw" podStartSLOduration=1.597566271 podStartE2EDuration="1.597566271s" podCreationTimestamp="2025-10-06 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:24:43.588490678 +0000 UTC m=+5480.743783180" watchObservedRunningTime="2025-10-06 16:24:43.597566271 +0000 UTC m=+5480.752858793" Oct 06 16:24:43 crc kubenswrapper[4763]: I1006 16:24:43.612926 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-7f98c" podStartSLOduration=1.6129060819999999 podStartE2EDuration="1.612906082s" podCreationTimestamp="2025-10-06 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:24:43.602833372 +0000 UTC m=+5480.758125884" watchObservedRunningTime="2025-10-06 16:24:43.612906082 +0000 UTC m=+5480.768198594" Oct 06 16:24:44 crc kubenswrapper[4763]: I1006 16:24:44.575301 4763 generic.go:334] "Generic (PLEG): container finished" podID="33bdf8ee-5273-48e3-9c11-bd14bb06998a" containerID="f1f419461efdedbd202967e64c87778a6aa491c1ccc56a07ba0f8336180b15a5" exitCode=0 Oct 06 16:24:44 crc kubenswrapper[4763]: I1006 16:24:44.575419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7f98c" event={"ID":"33bdf8ee-5273-48e3-9c11-bd14bb06998a","Type":"ContainerDied","Data":"f1f419461efdedbd202967e64c87778a6aa491c1ccc56a07ba0f8336180b15a5"} Oct 06 16:24:44 crc kubenswrapper[4763]: I1006 16:24:44.577934 4763 generic.go:334] "Generic (PLEG): container finished" podID="49b733de-b169-483e-b598-9767d7a7f76d" containerID="96274dc5d4ab3bf3fd5e3b2f28ed2071a6a2afc83207c30295342e22f1b19bc8" exitCode=0 Oct 06 16:24:44 crc kubenswrapper[4763]: I1006 16:24:44.578029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz6gw" event={"ID":"49b733de-b169-483e-b598-9767d7a7f76d","Type":"ContainerDied","Data":"96274dc5d4ab3bf3fd5e3b2f28ed2071a6a2afc83207c30295342e22f1b19bc8"} Oct 06 16:24:44 crc kubenswrapper[4763]: I1006 16:24:44.967847 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.121822 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lbbv\" (UniqueName: \"kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv\") pod \"774c046f-c1cb-440b-9e1e-3515d4d41738\" (UID: \"774c046f-c1cb-440b-9e1e-3515d4d41738\") " Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.136394 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv" (OuterVolumeSpecName: "kube-api-access-4lbbv") pod "774c046f-c1cb-440b-9e1e-3515d4d41738" (UID: "774c046f-c1cb-440b-9e1e-3515d4d41738"). InnerVolumeSpecName "kube-api-access-4lbbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.223247 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lbbv\" (UniqueName: \"kubernetes.io/projected/774c046f-c1cb-440b-9e1e-3515d4d41738-kube-api-access-4lbbv\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.589493 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vmffx" Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.594999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vmffx" event={"ID":"774c046f-c1cb-440b-9e1e-3515d4d41738","Type":"ContainerDied","Data":"0a7b2d29f22622a425556bd9d922fc3d21614e3cf6866ac248d06c250e57621c"} Oct 06 16:24:45 crc kubenswrapper[4763]: I1006 16:24:45.595045 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a7b2d29f22622a425556bd9d922fc3d21614e3cf6866ac248d06c250e57621c" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.007199 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.011730 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.141375 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxtlx\" (UniqueName: \"kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx\") pod \"49b733de-b169-483e-b598-9767d7a7f76d\" (UID: \"49b733de-b169-483e-b598-9767d7a7f76d\") " Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.141425 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sncjv\" (UniqueName: \"kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv\") pod \"33bdf8ee-5273-48e3-9c11-bd14bb06998a\" (UID: \"33bdf8ee-5273-48e3-9c11-bd14bb06998a\") " Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.145560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx" (OuterVolumeSpecName: "kube-api-access-jxtlx") pod "49b733de-b169-483e-b598-9767d7a7f76d" (UID: "49b733de-b169-483e-b598-9767d7a7f76d"). InnerVolumeSpecName "kube-api-access-jxtlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.145975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv" (OuterVolumeSpecName: "kube-api-access-sncjv") pod "33bdf8ee-5273-48e3-9c11-bd14bb06998a" (UID: "33bdf8ee-5273-48e3-9c11-bd14bb06998a"). InnerVolumeSpecName "kube-api-access-sncjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.243034 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxtlx\" (UniqueName: \"kubernetes.io/projected/49b733de-b169-483e-b598-9767d7a7f76d-kube-api-access-jxtlx\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.243066 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sncjv\" (UniqueName: \"kubernetes.io/projected/33bdf8ee-5273-48e3-9c11-bd14bb06998a-kube-api-access-sncjv\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.599845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz6gw" event={"ID":"49b733de-b169-483e-b598-9767d7a7f76d","Type":"ContainerDied","Data":"3ab424ba2e1b8cbba6ad3f927f108e1d1b7e79062e2a68953c5c81ba3d4926b8"} Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.600795 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ab424ba2e1b8cbba6ad3f927f108e1d1b7e79062e2a68953c5c81ba3d4926b8" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.599898 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz6gw" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.601576 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7f98c" event={"ID":"33bdf8ee-5273-48e3-9c11-bd14bb06998a","Type":"ContainerDied","Data":"e1e0053a7d2e43ce76624873427cee9a5b7d18518c23a93aea38e957443c0bf8"} Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.601603 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1e0053a7d2e43ce76624873427cee9a5b7d18518c23a93aea38e957443c0bf8" Oct 06 16:24:46 crc kubenswrapper[4763]: I1006 16:24:46.601699 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7f98c" Oct 06 16:24:49 crc kubenswrapper[4763]: I1006 16:24:49.575565 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:24:49 crc kubenswrapper[4763]: E1006 16:24:49.576404 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.530505 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d6c8-account-create-5rvxp"] Oct 06 16:24:52 crc kubenswrapper[4763]: E1006 16:24:52.531271 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774c046f-c1cb-440b-9e1e-3515d4d41738" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531288 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="774c046f-c1cb-440b-9e1e-3515d4d41738" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: E1006 16:24:52.531309 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b733de-b169-483e-b598-9767d7a7f76d" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531317 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b733de-b169-483e-b598-9767d7a7f76d" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: E1006 16:24:52.531340 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bdf8ee-5273-48e3-9c11-bd14bb06998a" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531349 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bdf8ee-5273-48e3-9c11-bd14bb06998a" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531553 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b733de-b169-483e-b598-9767d7a7f76d" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531582 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="774c046f-c1cb-440b-9e1e-3515d4d41738" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.531602 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bdf8ee-5273-48e3-9c11-bd14bb06998a" containerName="mariadb-database-create" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.532310 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.542347 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d6c8-account-create-5rvxp"] Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.543329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.628663 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2816-account-create-mblr6"] Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.629862 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.631664 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.638446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2816-account-create-mblr6"] Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.679567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffhf\" (UniqueName: \"kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf\") pod \"nova-cell0-2816-account-create-mblr6\" (UID: \"ba3f3df5-ca95-4209-abc7-84150d8acb0f\") " pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.679607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bq2m\" (UniqueName: \"kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m\") pod \"nova-api-d6c8-account-create-5rvxp\" (UID: \"379142c1-b420-458d-bc29-1102ef3ff096\") " pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.780805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mffhf\" (UniqueName: \"kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf\") pod \"nova-cell0-2816-account-create-mblr6\" (UID: \"ba3f3df5-ca95-4209-abc7-84150d8acb0f\") " pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.780856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bq2m\" (UniqueName: \"kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m\") pod \"nova-api-d6c8-account-create-5rvxp\" (UID: \"379142c1-b420-458d-bc29-1102ef3ff096\") " pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.805984 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bq2m\" (UniqueName: \"kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m\") pod \"nova-api-d6c8-account-create-5rvxp\" (UID: \"379142c1-b420-458d-bc29-1102ef3ff096\") " pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.806219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffhf\" (UniqueName: \"kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf\") pod \"nova-cell0-2816-account-create-mblr6\" (UID: 
\"ba3f3df5-ca95-4209-abc7-84150d8acb0f\") " pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.832087 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-89be-account-create-j47ck"] Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.833396 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.835677 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.841367 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-89be-account-create-j47ck"] Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.872215 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:52 crc kubenswrapper[4763]: I1006 16:24:52.946958 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:52.984098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kfn\" (UniqueName: \"kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn\") pod \"nova-cell1-89be-account-create-j47ck\" (UID: \"98f21252-6e8d-4f97-a3fa-c3f18afee924\") " pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.085694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kfn\" (UniqueName: \"kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn\") pod \"nova-cell1-89be-account-create-j47ck\" (UID: \"98f21252-6e8d-4f97-a3fa-c3f18afee924\") " pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.105935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kfn\" (UniqueName: \"kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn\") pod \"nova-cell1-89be-account-create-j47ck\" (UID: \"98f21252-6e8d-4f97-a3fa-c3f18afee924\") " pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.320000 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.872668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d6c8-account-create-5rvxp"] Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.929837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2816-account-create-mblr6"] Oct 06 16:24:53 crc kubenswrapper[4763]: W1006 16:24:53.930681 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba3f3df5_ca95_4209_abc7_84150d8acb0f.slice/crio-6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225 WatchSource:0}: Error finding container 6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225: Status 404 returned error can't find the container with id 6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225 Oct 06 16:24:53 crc kubenswrapper[4763]: W1006 16:24:53.940260 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f21252_6e8d_4f97_a3fa_c3f18afee924.slice/crio-cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb WatchSource:0}: Error finding container cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb: Status 404 returned error can't find the container with id cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb Oct 06 16:24:53 crc kubenswrapper[4763]: I1006 16:24:53.940695 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-89be-account-create-j47ck"] Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.681438 4763 generic.go:334] "Generic (PLEG): container finished" podID="379142c1-b420-458d-bc29-1102ef3ff096" containerID="f76aa74d43b76d76c38989ea818a0dd1788cb57cfa903bc157b89141b9c66786" exitCode=0 Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.681660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d6c8-account-create-5rvxp" event={"ID":"379142c1-b420-458d-bc29-1102ef3ff096","Type":"ContainerDied","Data":"f76aa74d43b76d76c38989ea818a0dd1788cb57cfa903bc157b89141b9c66786"} Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.681934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d6c8-account-create-5rvxp" event={"ID":"379142c1-b420-458d-bc29-1102ef3ff096","Type":"ContainerStarted","Data":"e7655e824f978667f8796c21996fae61c6e01eb713827d425b1f2e569d935aa9"} Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.684347 4763 generic.go:334] "Generic (PLEG): container finished" podID="98f21252-6e8d-4f97-a3fa-c3f18afee924" containerID="e124679b91b485012312872ea98351c7927f485535fdc5d4bbf79f325ccef9a7" exitCode=0 Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.684416 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-89be-account-create-j47ck" event={"ID":"98f21252-6e8d-4f97-a3fa-c3f18afee924","Type":"ContainerDied","Data":"e124679b91b485012312872ea98351c7927f485535fdc5d4bbf79f325ccef9a7"} Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.684467 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-89be-account-create-j47ck" event={"ID":"98f21252-6e8d-4f97-a3fa-c3f18afee924","Type":"ContainerStarted","Data":"cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb"} Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.688079 4763 generic.go:334] 
"Generic (PLEG): container finished" podID="ba3f3df5-ca95-4209-abc7-84150d8acb0f" containerID="1348fb122ebd3a676b1ba3eff24139d3621803eda8fe10fa277dd55f53ad0a56" exitCode=0 Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.688131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2816-account-create-mblr6" event={"ID":"ba3f3df5-ca95-4209-abc7-84150d8acb0f","Type":"ContainerDied","Data":"1348fb122ebd3a676b1ba3eff24139d3621803eda8fe10fa277dd55f53ad0a56"} Oct 06 16:24:54 crc kubenswrapper[4763]: I1006 16:24:54.688164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2816-account-create-mblr6" event={"ID":"ba3f3df5-ca95-4209-abc7-84150d8acb0f","Type":"ContainerStarted","Data":"6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225"} Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.206165 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.212595 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.220099 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.245808 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5kfn\" (UniqueName: \"kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn\") pod \"98f21252-6e8d-4f97-a3fa-c3f18afee924\" (UID: \"98f21252-6e8d-4f97-a3fa-c3f18afee924\") " Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.246042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mffhf\" (UniqueName: \"kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf\") pod \"ba3f3df5-ca95-4209-abc7-84150d8acb0f\" (UID: \"ba3f3df5-ca95-4209-abc7-84150d8acb0f\") " Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.246078 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bq2m\" (UniqueName: \"kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m\") pod \"379142c1-b420-458d-bc29-1102ef3ff096\" (UID: \"379142c1-b420-458d-bc29-1102ef3ff096\") " Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.255958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf" (OuterVolumeSpecName: "kube-api-access-mffhf") pod "ba3f3df5-ca95-4209-abc7-84150d8acb0f" (UID: "ba3f3df5-ca95-4209-abc7-84150d8acb0f"). InnerVolumeSpecName "kube-api-access-mffhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.261236 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m" (OuterVolumeSpecName: "kube-api-access-6bq2m") pod "379142c1-b420-458d-bc29-1102ef3ff096" (UID: "379142c1-b420-458d-bc29-1102ef3ff096"). InnerVolumeSpecName "kube-api-access-6bq2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.272237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn" (OuterVolumeSpecName: "kube-api-access-d5kfn") pod "98f21252-6e8d-4f97-a3fa-c3f18afee924" (UID: "98f21252-6e8d-4f97-a3fa-c3f18afee924"). InnerVolumeSpecName "kube-api-access-d5kfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.347638 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mffhf\" (UniqueName: \"kubernetes.io/projected/ba3f3df5-ca95-4209-abc7-84150d8acb0f-kube-api-access-mffhf\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.347667 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bq2m\" (UniqueName: \"kubernetes.io/projected/379142c1-b420-458d-bc29-1102ef3ff096-kube-api-access-6bq2m\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.347678 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5kfn\" (UniqueName: \"kubernetes.io/projected/98f21252-6e8d-4f97-a3fa-c3f18afee924-kube-api-access-d5kfn\") on node \"crc\" DevicePath \"\"" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.713392 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d6c8-account-create-5rvxp" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.713721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d6c8-account-create-5rvxp" event={"ID":"379142c1-b420-458d-bc29-1102ef3ff096","Type":"ContainerDied","Data":"e7655e824f978667f8796c21996fae61c6e01eb713827d425b1f2e569d935aa9"} Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.713765 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7655e824f978667f8796c21996fae61c6e01eb713827d425b1f2e569d935aa9" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.715717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-89be-account-create-j47ck" event={"ID":"98f21252-6e8d-4f97-a3fa-c3f18afee924","Type":"ContainerDied","Data":"cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb"} Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.715741 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cabfdc6e77d8e3bac0ce1d135aa726a17a64faf602dd9e07430f68138c602fbb" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.715795 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-89be-account-create-j47ck" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.717541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2816-account-create-mblr6" event={"ID":"ba3f3df5-ca95-4209-abc7-84150d8acb0f","Type":"ContainerDied","Data":"6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225"} Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.717569 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b7c2135bd4873eb317b92ea5ccb914f497a7e784ab59e9f4d71467b9ab50225" Oct 06 16:24:56 crc kubenswrapper[4763]: I1006 16:24:56.717570 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2816-account-create-mblr6" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.845487 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq6fm"] Oct 06 16:24:57 crc kubenswrapper[4763]: E1006 16:24:57.846079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3f3df5-ca95-4209-abc7-84150d8acb0f" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846090 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3f3df5-ca95-4209-abc7-84150d8acb0f" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: E1006 16:24:57.846116 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379142c1-b420-458d-bc29-1102ef3ff096" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846122 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="379142c1-b420-458d-bc29-1102ef3ff096" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: E1006 16:24:57.846133 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f21252-6e8d-4f97-a3fa-c3f18afee924" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846138 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f21252-6e8d-4f97-a3fa-c3f18afee924" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846318 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="379142c1-b420-458d-bc29-1102ef3ff096" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846331 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f21252-6e8d-4f97-a3fa-c3f18afee924" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846342 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3f3df5-ca95-4209-abc7-84150d8acb0f" containerName="mariadb-account-create" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.846918 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.849018 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-76tp2" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.849721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.849875 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.861914 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq6fm"] Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.975112 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.975193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.975267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcrs\" (UniqueName: \"kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:57 crc kubenswrapper[4763]: I1006 16:24:57.975321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.077347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcrs\" (UniqueName: \"kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.077466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.077653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: 
\"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.077727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.082847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.082847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.084606 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.105748 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcrs\" (UniqueName: \"kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs\") pod \"nova-cell0-conductor-db-sync-jq6fm\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.175055 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.633470 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq6fm"] Oct 06 16:24:58 crc kubenswrapper[4763]: I1006 16:24:58.750997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" event={"ID":"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace","Type":"ContainerStarted","Data":"8de3e7680bd45ea43a34be5abb5d50447b53efa1110ede036ae814da821db87c"} Oct 06 16:24:59 crc kubenswrapper[4763]: I1006 16:24:59.761964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" event={"ID":"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace","Type":"ContainerStarted","Data":"7c01df081de2b96a4a52010e21a640be99247524173b40054f496c8248373aa0"} Oct 06 16:24:59 crc kubenswrapper[4763]: I1006 16:24:59.792247 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" podStartSLOduration=2.792225428 podStartE2EDuration="2.792225428s" podCreationTimestamp="2025-10-06 16:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:24:59.785695963 +0000 UTC m=+5496.940988525" watchObservedRunningTime="2025-10-06 16:24:59.792225428 +0000 UTC m=+5496.947517940" Oct 06 16:25:00 crc kubenswrapper[4763]: I1006 16:25:00.575342 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:25:00 crc kubenswrapper[4763]: E1006 16:25:00.575929 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:25:03 crc kubenswrapper[4763]: I1006 16:25:03.802246 4763 generic.go:334] "Generic (PLEG): container finished" podID="1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" containerID="7c01df081de2b96a4a52010e21a640be99247524173b40054f496c8248373aa0" exitCode=0 Oct 06 16:25:03 crc kubenswrapper[4763]: I1006 16:25:03.802681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" event={"ID":"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace","Type":"ContainerDied","Data":"7c01df081de2b96a4a52010e21a640be99247524173b40054f496c8248373aa0"} Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.140603 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.229401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle\") pod \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.229514 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data\") pod \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.229553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts\") pod \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.229592 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxcrs\" (UniqueName: \"kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs\") pod \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\" (UID: \"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace\") " Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.234740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts" (OuterVolumeSpecName: "scripts") pod "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" (UID: "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.234798 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs" (OuterVolumeSpecName: "kube-api-access-gxcrs") pod "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" (UID: "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace"). InnerVolumeSpecName "kube-api-access-gxcrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.255956 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data" (OuterVolumeSpecName: "config-data") pod "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" (UID: "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.265206 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" (UID: "1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.332155 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.332189 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.332202 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxcrs\" (UniqueName: \"kubernetes.io/projected/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-kube-api-access-gxcrs\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.332216 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.838504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" event={"ID":"1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace","Type":"ContainerDied","Data":"8de3e7680bd45ea43a34be5abb5d50447b53efa1110ede036ae814da821db87c"} Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.839012 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8de3e7680bd45ea43a34be5abb5d50447b53efa1110ede036ae814da821db87c" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.838587 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq6fm" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.906866 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:25:05 crc kubenswrapper[4763]: E1006 16:25:05.907252 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" containerName="nova-cell0-conductor-db-sync" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.907276 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" containerName="nova-cell0-conductor-db-sync" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.907505 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" containerName="nova-cell0-conductor-db-sync" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.908236 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.910443 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.910858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-76tp2" Oct 06 16:25:05 crc kubenswrapper[4763]: I1006 16:25:05.918728 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.047423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.047562 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.048235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwn9m\" (UniqueName: \"kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.149452 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.149517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.149622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwn9m\" (UniqueName: \"kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.155989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.156807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.170957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwn9m\" (UniqueName: \"kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m\") pod \"nova-cell0-conductor-0\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.240020 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.682213 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:25:06 crc kubenswrapper[4763]: I1006 16:25:06.853717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ed10075-b98c-4856-9515-d2ee60040058","Type":"ContainerStarted","Data":"c27924aa5eb68d57e7dbf19fca4b7f6032f251339dbbed5969739667e94fb254"} Oct 06 16:25:07 crc kubenswrapper[4763]: I1006 16:25:07.866017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ed10075-b98c-4856-9515-d2ee60040058","Type":"ContainerStarted","Data":"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c"} Oct 06 16:25:07 crc kubenswrapper[4763]: I1006 16:25:07.866823 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:07 crc kubenswrapper[4763]: I1006 16:25:07.899505 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8994723540000003 podStartE2EDuration="2.899472354s" podCreationTimestamp="2025-10-06 16:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:07.889357793 +0000 UTC m=+5505.044650335" watchObservedRunningTime="2025-10-06 16:25:07.899472354 +0000 UTC m=+5505.054764906" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.279779 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.575363 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:25:11 crc kubenswrapper[4763]: E1006 16:25:11.575760 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.849563 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ldxd4"] Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.852266 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.855164 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.855393 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.893012 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ldxd4"] Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.964523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.964638 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlmh\" (UniqueName: \"kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.964681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:11 crc kubenswrapper[4763]: I1006 16:25:11.964720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.054939 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.073115 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlmh\" (UniqueName: \"kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.073200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.073256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:12 crc 
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.108165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.122040 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.122199 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.144802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.158326 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.162242 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlmh\" (UniqueName: \"kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.166552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data\") pod \"nova-cell0-cell-mapping-ldxd4\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " pod="openstack/nova-cell0-cell-mapping-ldxd4"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.167962 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.170733 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.183242 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.183971 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ldxd4"
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.217337 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.224842 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.226669 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.230351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.237750 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.278849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.281523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkqxw\" (UniqueName: \"kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.281753 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.281829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.282092 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.282147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knfmv\" (UniqueName: \"kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.282254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.330803 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.332553 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.341057 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.354496 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393865 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqnr\" (UniqueName: \"kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knfmv\" (UniqueName: \"kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.393965 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.394007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkqxw\" (UniqueName: 
\"kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.394046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.398565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.399120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.404779 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.406686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.410691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.421099 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.423429 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.434948 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knfmv\" (UniqueName: \"kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv\") pod \"nova-api-0\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.435199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.435413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkqxw\" (UniqueName: \"kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclgn\" (UniqueName: \"kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqnr\" (UniqueName: \"kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.495714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.499049 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.501653 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.550466 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzqnr\" (UniqueName: \"kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr\") pod \"nova-scheduler-0\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.588060 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh755\" (UniqueName: \"kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597587 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclgn\" (UniqueName: \"kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.597934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.598483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.601494 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.602283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.614869 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclgn\" (UniqueName: \"kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn\") pod \"nova-metadata-0\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.665254 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.700588 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.700999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.701081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.701133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.701230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.701258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh755\" (UniqueName: \"kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.702388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.702388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.702724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.703032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.722512 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh755\" (UniqueName: \"kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755\") pod \"dnsmasq-dns-69dc7db885-s6l89\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.722526 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.751278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.852110 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ldxd4"] Oct 06 16:25:12 crc kubenswrapper[4763]: I1006 16:25:12.948406 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ldxd4" event={"ID":"19d391b2-f609-4a0e-857a-5c64ab6168f5","Type":"ContainerStarted","Data":"7eac5834e72c4dd4190176effccb585f3b2d9b73e47daf977905ccf9d1a45ad8"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.048188 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:13 crc kubenswrapper[4763]: W1006 16:25:13.050499 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c13798_76e6_4850_86ff_535885da0f9d.slice/crio-48bbbe1685d1fa397772ad6d0dee8f6f9d81c87e637eedeba32042f59c12933d WatchSource:0}: Error finding container 48bbbe1685d1fa397772ad6d0dee8f6f9d81c87e637eedeba32042f59c12933d: Status 404 returned error can't find the container with id 48bbbe1685d1fa397772ad6d0dee8f6f9d81c87e637eedeba32042f59c12933d Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.226985 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.245844 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.255475 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.381000 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.441258 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-964hw"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.442495 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.444473 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.444999 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.449419 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-964hw"] Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.516177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.516212 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.516229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt28d\" (UniqueName: \"kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.516269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.618164 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.618216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.618238 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt28d\" (UniqueName: \"kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.618287 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.629556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.636909 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.658341 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt28d\" (UniqueName: \"kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.663802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-964hw\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.774353 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.963707 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerStarted","Data":"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.964062 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerStarted","Data":"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.964075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerStarted","Data":"48bbbe1685d1fa397772ad6d0dee8f6f9d81c87e637eedeba32042f59c12933d"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.966118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerStarted","Data":"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.966150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerStarted","Data":"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.966164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerStarted","Data":"b4e8f35290430c55c933f7568e6aa4495329c7db9e3caedee902763a0f52440a"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.968887 4763 generic.go:334] "Generic (PLEG): container finished" podID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerID="b947f9d183450207b2e2827dd995f9c8a3e5abe709b10a68a2ff9d80b32661e6" exitCode=0 Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.968968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" event={"ID":"31c54c0b-4550-41a7-a851-5aed75eee87e","Type":"ContainerDied","Data":"b947f9d183450207b2e2827dd995f9c8a3e5abe709b10a68a2ff9d80b32661e6"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.968991 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" event={"ID":"31c54c0b-4550-41a7-a851-5aed75eee87e","Type":"ContainerStarted","Data":"afd38c41f5cff00421924aec2c475c35c18ca5a3d13bff0e9a73dae0347d06cf"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.981748 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.981732834 podStartE2EDuration="1.981732834s" podCreationTimestamp="2025-10-06 16:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:13.980015158 +0000 UTC m=+5511.135307670" watchObservedRunningTime="2025-10-06 16:25:13.981732834 +0000 UTC m=+5511.137025336" Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.982761 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dd162699-19f0-4808-9b7d-f916d3796d34","Type":"ContainerStarted","Data":"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.983439 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd162699-19f0-4808-9b7d-f916d3796d34","Type":"ContainerStarted","Data":"2f8e04fae3716a5b4692bd1405dffe9acdc4b5a19332bdbb3329866132a97223"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.986033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ldxd4" event={"ID":"19d391b2-f609-4a0e-857a-5c64ab6168f5","Type":"ContainerStarted","Data":"805c82b71bc72f8ec1d422d3b63b4fdfd281515aa4f3d5d9cf69fb0ce2bd2e10"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.994597 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d856fcef-f614-4291-818c-3f77282f5940","Type":"ContainerStarted","Data":"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e"} Oct 06 16:25:13 crc kubenswrapper[4763]: I1006 16:25:13.994686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d856fcef-f614-4291-818c-3f77282f5940","Type":"ContainerStarted","Data":"48587decd952c43bf3bc1bdd1430087e2947358079a04e0bf158498b0b0a5015"} Oct 06 16:25:14 crc kubenswrapper[4763]: I1006 16:25:14.002340 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.002324936 podStartE2EDuration="2.002324936s" podCreationTimestamp="2025-10-06 16:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:13.999159101 +0000 UTC m=+5511.154451623" watchObservedRunningTime="2025-10-06 16:25:14.002324936 +0000 UTC m=+5511.157617448" Oct 06 16:25:14 crc kubenswrapper[4763]: I1006 16:25:14.047692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-964hw"] Oct 06 16:25:14 crc kubenswrapper[4763]: I1006 16:25:14.064844 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.06482528 podStartE2EDuration="2.06482528s" podCreationTimestamp="2025-10-06 16:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:14.05253711 +0000 UTC m=+5511.207829622" watchObservedRunningTime="2025-10-06 16:25:14.06482528 +0000 UTC m=+5511.220117792" Oct 06 16:25:14 crc kubenswrapper[4763]: I1006 16:25:14.067857 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.06781518 podStartE2EDuration="2.06781518s" podCreationTimestamp="2025-10-06 16:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:14.067597134 +0000 UTC m=+5511.222889646" watchObservedRunningTime="2025-10-06 16:25:14.06781518 +0000 UTC m=+5511.223107682" Oct 06 16:25:14 crc kubenswrapper[4763]: I1006 16:25:14.091999 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ldxd4" podStartSLOduration=3.091978287 podStartE2EDuration="3.091978287s" podCreationTimestamp="2025-10-06 16:25:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:14.084773154 +0000 UTC m=+5511.240065666" watchObservedRunningTime="2025-10-06 16:25:14.091978287 +0000 UTC m=+5511.247270799"
Oct 06 16:25:15 crc kubenswrapper[4763]: I1006 16:25:15.015113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-964hw" event={"ID":"2d34157a-16c1-44e4-b187-8b1294ac635b","Type":"ContainerStarted","Data":"d20fa5f94991af9a694579a7c1f5105f1fa8b7f35738e581b42e0420b69b5e37"}
Oct 06 16:25:15 crc kubenswrapper[4763]: I1006 16:25:15.015454 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-964hw" event={"ID":"2d34157a-16c1-44e4-b187-8b1294ac635b","Type":"ContainerStarted","Data":"80680c7f4a20fdef34828f000d9a21b1a7e8c591484f7a2528fb964d53846471"}
Oct 06 16:25:15 crc kubenswrapper[4763]: I1006 16:25:15.026952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" event={"ID":"31c54c0b-4550-41a7-a851-5aed75eee87e","Type":"ContainerStarted","Data":"179a291999c36f551433be402096687932bbb292cc0bfb73cd3377bec7f81ef1"}
Oct 06 16:25:15 crc kubenswrapper[4763]: I1006 16:25:15.052933 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-964hw" podStartSLOduration=2.052905573 podStartE2EDuration="2.052905573s" podCreationTimestamp="2025-10-06 16:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:15.051838165 +0000 UTC m=+5512.207130667" watchObservedRunningTime="2025-10-06 16:25:15.052905573 +0000 UTC m=+5512.208198115"
Oct 06 16:25:15 crc kubenswrapper[4763]: I1006 16:25:15.084220 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" podStartSLOduration=3.084198572 podStartE2EDuration="3.084198572s" podCreationTimestamp="2025-10-06 16:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:15.077173783 +0000 UTC m=+5512.232466315" watchObservedRunningTime="2025-10-06 16:25:15.084198572 +0000 UTC m=+5512.239491084"
Oct 06 16:25:16 crc kubenswrapper[4763]: I1006 16:25:16.034866 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69dc7db885-s6l89"
Oct 06 16:25:17 crc kubenswrapper[4763]: I1006 16:25:17.665711 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 16:25:17 crc kubenswrapper[4763]: I1006 16:25:17.702009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 06 16:25:17 crc kubenswrapper[4763]: I1006 16:25:17.723695 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 16:25:17 crc kubenswrapper[4763]: I1006 16:25:17.723854 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 06 16:25:18 crc kubenswrapper[4763]: I1006 16:25:18.058796 4763 generic.go:334] "Generic (PLEG): container finished" podID="19d391b2-f609-4a0e-857a-5c64ab6168f5" containerID="805c82b71bc72f8ec1d422d3b63b4fdfd281515aa4f3d5d9cf69fb0ce2bd2e10" exitCode=0
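The "SyncLoop (probe)" records feed probe outcomes back into the kubelet's sync loop; here an empty status reports a container as not ready yet, while later records ("ready", "started") flip the state. A toy Go sketch of that mapping, with an invented probeResult type rather than the kubelet's own types:

    package main

    import "fmt"

    type probeResult string

    // ready maps the status strings seen in the log onto a boolean gate:
    // "" and "unhealthy" leave the container gated, "ready"/"started" open it.
    func ready(r probeResult) bool {
        switch r {
        case "ready", "started":
            return true
        default: // "" or "unhealthy"
            return false
        }
    }

    func main() {
        for _, r := range []probeResult{"", "unhealthy", "started", "ready"} {
            fmt.Printf("probe status=%q -> ready=%v\n", r, ready(r))
        }
    }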
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ldxd4" event={"ID":"19d391b2-f609-4a0e-857a-5c64ab6168f5","Type":"ContainerDied","Data":"805c82b71bc72f8ec1d422d3b63b4fdfd281515aa4f3d5d9cf69fb0ce2bd2e10"} Oct 06 16:25:18 crc kubenswrapper[4763]: I1006 16:25:18.061178 4763 generic.go:334] "Generic (PLEG): container finished" podID="2d34157a-16c1-44e4-b187-8b1294ac635b" containerID="d20fa5f94991af9a694579a7c1f5105f1fa8b7f35738e581b42e0420b69b5e37" exitCode=0 Oct 06 16:25:18 crc kubenswrapper[4763]: I1006 16:25:18.061220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-964hw" event={"ID":"2d34157a-16c1-44e4-b187-8b1294ac635b","Type":"ContainerDied","Data":"d20fa5f94991af9a694579a7c1f5105f1fa8b7f35738e581b42e0420b69b5e37"} Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.394206 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.496897 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.562213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-combined-ca-bundle\") pod \"19d391b2-f609-4a0e-857a-5c64ab6168f5\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.562307 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data\") pod \"19d391b2-f609-4a0e-857a-5c64ab6168f5\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.562363 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts\") pod \"19d391b2-f609-4a0e-857a-5c64ab6168f5\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.562392 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhlmh\" (UniqueName: \"kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh\") pod \"19d391b2-f609-4a0e-857a-5c64ab6168f5\" (UID: \"19d391b2-f609-4a0e-857a-5c64ab6168f5\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.568899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh" (OuterVolumeSpecName: "kube-api-access-bhlmh") pod "19d391b2-f609-4a0e-857a-5c64ab6168f5" (UID: "19d391b2-f609-4a0e-857a-5c64ab6168f5"). InnerVolumeSpecName "kube-api-access-bhlmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.579333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts" (OuterVolumeSpecName: "scripts") pod "19d391b2-f609-4a0e-857a-5c64ab6168f5" (UID: "19d391b2-f609-4a0e-857a-5c64ab6168f5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.590886 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d391b2-f609-4a0e-857a-5c64ab6168f5" (UID: "19d391b2-f609-4a0e-857a-5c64ab6168f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.591122 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data" (OuterVolumeSpecName: "config-data") pod "19d391b2-f609-4a0e-857a-5c64ab6168f5" (UID: "19d391b2-f609-4a0e-857a-5c64ab6168f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.663910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data\") pod \"2d34157a-16c1-44e4-b187-8b1294ac635b\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.663969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle\") pod \"2d34157a-16c1-44e4-b187-8b1294ac635b\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts\") pod \"2d34157a-16c1-44e4-b187-8b1294ac635b\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664073 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt28d\" (UniqueName: \"kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d\") pod \"2d34157a-16c1-44e4-b187-8b1294ac635b\" (UID: \"2d34157a-16c1-44e4-b187-8b1294ac635b\") " Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664506 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664526 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664537 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d391b2-f609-4a0e-857a-5c64ab6168f5-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.664548 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhlmh\" (UniqueName: \"kubernetes.io/projected/19d391b2-f609-4a0e-857a-5c64ab6168f5-kube-api-access-bhlmh\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.667379 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts" (OuterVolumeSpecName: "scripts") pod "2d34157a-16c1-44e4-b187-8b1294ac635b" (UID: "2d34157a-16c1-44e4-b187-8b1294ac635b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.667561 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d" (OuterVolumeSpecName: "kube-api-access-bt28d") pod "2d34157a-16c1-44e4-b187-8b1294ac635b" (UID: "2d34157a-16c1-44e4-b187-8b1294ac635b"). InnerVolumeSpecName "kube-api-access-bt28d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.690458 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d34157a-16c1-44e4-b187-8b1294ac635b" (UID: "2d34157a-16c1-44e4-b187-8b1294ac635b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.697667 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data" (OuterVolumeSpecName: "config-data") pod "2d34157a-16c1-44e4-b187-8b1294ac635b" (UID: "2d34157a-16c1-44e4-b187-8b1294ac635b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.766580 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt28d\" (UniqueName: \"kubernetes.io/projected/2d34157a-16c1-44e4-b187-8b1294ac635b-kube-api-access-bt28d\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.766671 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.766693 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:19 crc kubenswrapper[4763]: I1006 16:25:19.766709 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d34157a-16c1-44e4-b187-8b1294ac635b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.088120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ldxd4" event={"ID":"19d391b2-f609-4a0e-857a-5c64ab6168f5","Type":"ContainerDied","Data":"7eac5834e72c4dd4190176effccb585f3b2d9b73e47daf977905ccf9d1a45ad8"} Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.088185 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eac5834e72c4dd4190176effccb585f3b2d9b73e47daf977905ccf9d1a45ad8" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.088149 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ldxd4" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.095845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-964hw" event={"ID":"2d34157a-16c1-44e4-b187-8b1294ac635b","Type":"ContainerDied","Data":"80680c7f4a20fdef34828f000d9a21b1a7e8c591484f7a2528fb964d53846471"} Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.095893 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80680c7f4a20fdef34828f000d9a21b1a7e8c591484f7a2528fb964d53846471" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.095977 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-964hw" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.195824 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:25:20 crc kubenswrapper[4763]: E1006 16:25:20.196996 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d34157a-16c1-44e4-b187-8b1294ac635b" containerName="nova-cell1-conductor-db-sync" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.197017 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d34157a-16c1-44e4-b187-8b1294ac635b" containerName="nova-cell1-conductor-db-sync" Oct 06 16:25:20 crc kubenswrapper[4763]: E1006 16:25:20.197099 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d391b2-f609-4a0e-857a-5c64ab6168f5" containerName="nova-manage" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.197117 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d391b2-f609-4a0e-857a-5c64ab6168f5" containerName="nova-manage" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.197655 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d34157a-16c1-44e4-b187-8b1294ac635b" containerName="nova-cell1-conductor-db-sync" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.197684 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d391b2-f609-4a0e-857a-5c64ab6168f5" containerName="nova-manage" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.198889 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.230753 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.234836 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.277041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhv7\" (UniqueName: \"kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.277124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.277154 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.366364 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.366798 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-log" containerID="cri-o://ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" gracePeriod=30
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.367371 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-api" containerID="cri-o://3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" gracePeriod=30
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.378639 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0"
Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.379158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhv7\" (UniqueName: \"kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0"
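The "Killing container with a grace period" records ask the runtime to stop nova-api-0's containers with gracePeriod=30: SIGTERM first, SIGKILL only if the container outlives the grace period (the exitCode=143 reported later is 128+15, i.e. terminated by SIGTERM). A hedged Go sketch of the same pattern applied to an ordinary process (illustrative, not CRI-O's implementation):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // killWithGrace sends SIGTERM, waits up to grace, then falls back to SIGKILL.
    func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        _ = cmd.Process.Signal(syscall.SIGTERM) // polite stop; exits as 128+15=143
        select {
        case <-done:
            fmt.Println("exited within the grace period")
        case <-time.After(grace):
            _ = cmd.Process.Kill() // SIGKILL once the grace period is spent
            <-done
            fmt.Println("killed after the grace period")
        }
    }

    func main() {
        cmd := exec.Command("sleep", "300")
        _ = cmd.Start()
        killWithGrace(cmd, 2*time.Second) // the pods in the log were given 30s
    }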
pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.384750 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.391030 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.416431 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhv7\" (UniqueName: \"kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7\") pod \"nova-cell1-conductor-0\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.419241 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.419492 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-log" containerID="cri-o://55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" gracePeriod=30 Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.419578 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-metadata" containerID="cri-o://241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" gracePeriod=30 Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.434149 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.434383 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dd162699-19f0-4808-9b7d-f916d3796d34" containerName="nova-scheduler-scheduler" containerID="cri-o://38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72" gracePeriod=30 Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.531333 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.995499 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:20 crc kubenswrapper[4763]: I1006 16:25:20.998152 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:25:21 crc kubenswrapper[4763]: W1006 16:25:21.017535 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85f902d_6b8b_4847_8581_b6a42fcc875e.slice/crio-b05f6ad18b988fe29e1fed5823d12c7ad57953663034760528c1af04b3ef167f WatchSource:0}: Error finding container b05f6ad18b988fe29e1fed5823d12c7ad57953663034760528c1af04b3ef167f: Status 404 returned error can't find the container with id b05f6ad18b988fe29e1fed5823d12c7ad57953663034760528c1af04b3ef167f Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.044583 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.093295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs\") pod \"e2c13798-76e6-4850-86ff-535885da0f9d\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.093406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knfmv\" (UniqueName: \"kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv\") pod \"e2c13798-76e6-4850-86ff-535885da0f9d\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.093436 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data\") pod \"e2c13798-76e6-4850-86ff-535885da0f9d\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.093547 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle\") pod \"e2c13798-76e6-4850-86ff-535885da0f9d\" (UID: \"e2c13798-76e6-4850-86ff-535885da0f9d\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.095359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs" (OuterVolumeSpecName: "logs") pod "e2c13798-76e6-4850-86ff-535885da0f9d" (UID: "e2c13798-76e6-4850-86ff-535885da0f9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.097608 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv" (OuterVolumeSpecName: "kube-api-access-knfmv") pod "e2c13798-76e6-4850-86ff-535885da0f9d" (UID: "e2c13798-76e6-4850-86ff-535885da0f9d"). InnerVolumeSpecName "kube-api-access-knfmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.109527 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f85f902d-6b8b-4847-8581-b6a42fcc875e","Type":"ContainerStarted","Data":"b05f6ad18b988fe29e1fed5823d12c7ad57953663034760528c1af04b3ef167f"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112487 4763 generic.go:334] "Generic (PLEG): container finished" podID="e2c13798-76e6-4850-86ff-535885da0f9d" containerID="3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" exitCode=0 Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112511 4763 generic.go:334] "Generic (PLEG): container finished" podID="e2c13798-76e6-4850-86ff-535885da0f9d" containerID="ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" exitCode=143 Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerDied","Data":"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerDied","Data":"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112568 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c13798-76e6-4850-86ff-535885da0f9d","Type":"ContainerDied","Data":"48bbbe1685d1fa397772ad6d0dee8f6f9d81c87e637eedeba32042f59c12933d"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112585 4763 scope.go:117] "RemoveContainer" containerID="3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.112717 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.116979 4763 generic.go:334] "Generic (PLEG): container finished" podID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerID="241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" exitCode=0 Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.117062 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.117130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerDied","Data":"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.117191 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerDied","Data":"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.117373 4763 generic.go:334] "Generic (PLEG): container finished" podID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerID="55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" exitCode=143 Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.117402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d771753-bad2-43d2-b87e-3f1dbea976b6","Type":"ContainerDied","Data":"b4e8f35290430c55c933f7568e6aa4495329c7db9e3caedee902763a0f52440a"} Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.125753 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c13798-76e6-4850-86ff-535885da0f9d" (UID: "e2c13798-76e6-4850-86ff-535885da0f9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.126540 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data" (OuterVolumeSpecName: "config-data") pod "e2c13798-76e6-4850-86ff-535885da0f9d" (UID: "e2c13798-76e6-4850-86ff-535885da0f9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.138975 4763 scope.go:117] "RemoveContainer" containerID="ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.165980 4763 scope.go:117] "RemoveContainer" containerID="3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.166422 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076\": container with ID starting with 3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076 not found: ID does not exist" containerID="3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.166461 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076"} err="failed to get container status \"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076\": rpc error: code = NotFound desc = could not find container \"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076\": container with ID starting with 3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076 not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.166488 4763 scope.go:117] "RemoveContainer" containerID="ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.166825 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4\": container with ID starting with ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4 not found: ID does not exist" containerID="ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.166847 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4"} err="failed to get container status \"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4\": rpc error: code = NotFound desc = could not find container \"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4\": container with ID starting with ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4 not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.166859 4763 scope.go:117] "RemoveContainer" containerID="3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.167094 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076"} err="failed to get container status \"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076\": rpc error: code = NotFound desc = could not find container \"3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076\": container with ID starting with 3214d27a6a8f5047a5405ee84785d103d6e402efe0116a68224d74339fe7b076 not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.167133 4763 
scope.go:117] "RemoveContainer" containerID="ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.167401 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4"} err="failed to get container status \"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4\": rpc error: code = NotFound desc = could not find container \"ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4\": container with ID starting with ecd03e1227461f5804b256e7215e0670ea5f3b409ac14865befb62d2ffb802d4 not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.167419 4763 scope.go:117] "RemoveContainer" containerID="241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.187714 4763 scope.go:117] "RemoveContainer" containerID="55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.194970 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs\") pod \"3d771753-bad2-43d2-b87e-3f1dbea976b6\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pclgn\" (UniqueName: \"kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn\") pod \"3d771753-bad2-43d2-b87e-3f1dbea976b6\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle\") pod \"3d771753-bad2-43d2-b87e-3f1dbea976b6\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195101 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data\") pod \"3d771753-bad2-43d2-b87e-3f1dbea976b6\" (UID: \"3d771753-bad2-43d2-b87e-3f1dbea976b6\") " Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195284 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs" (OuterVolumeSpecName: "logs") pod "3d771753-bad2-43d2-b87e-3f1dbea976b6" (UID: "3d771753-bad2-43d2-b87e-3f1dbea976b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195541 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knfmv\" (UniqueName: \"kubernetes.io/projected/e2c13798-76e6-4850-86ff-535885da0f9d-kube-api-access-knfmv\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195559 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195569 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c13798-76e6-4850-86ff-535885da0f9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195580 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d771753-bad2-43d2-b87e-3f1dbea976b6-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.195589 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c13798-76e6-4850-86ff-535885da0f9d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.198556 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn" (OuterVolumeSpecName: "kube-api-access-pclgn") pod "3d771753-bad2-43d2-b87e-3f1dbea976b6" (UID: "3d771753-bad2-43d2-b87e-3f1dbea976b6"). InnerVolumeSpecName "kube-api-access-pclgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.206075 4763 scope.go:117] "RemoveContainer" containerID="241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.206698 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b\": container with ID starting with 241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b not found: ID does not exist" containerID="241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.206748 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b"} err="failed to get container status \"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b\": rpc error: code = NotFound desc = could not find container \"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b\": container with ID starting with 241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.206773 4763 scope.go:117] "RemoveContainer" containerID="55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.207230 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f\": container with ID starting with 
55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f not found: ID does not exist" containerID="55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.207265 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f"} err="failed to get container status \"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f\": rpc error: code = NotFound desc = could not find container \"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f\": container with ID starting with 55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.207278 4763 scope.go:117] "RemoveContainer" containerID="241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.207641 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b"} err="failed to get container status \"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b\": rpc error: code = NotFound desc = could not find container \"241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b\": container with ID starting with 241ff99d79c8434d95bda3c8548e75c902743a568376b348106ea9318985a63b not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.207655 4763 scope.go:117] "RemoveContainer" containerID="55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.208164 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f"} err="failed to get container status \"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f\": rpc error: code = NotFound desc = could not find container \"55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f\": container with ID starting with 55e757ead069812f62c6004598d8ba45d29a630588215cfec94f9ac3a1fd3b9f not found: ID does not exist" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.226823 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d771753-bad2-43d2-b87e-3f1dbea976b6" (UID: "3d771753-bad2-43d2-b87e-3f1dbea976b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.241826 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data" (OuterVolumeSpecName: "config-data") pod "3d771753-bad2-43d2-b87e-3f1dbea976b6" (UID: "3d771753-bad2-43d2-b87e-3f1dbea976b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.296783 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pclgn\" (UniqueName: \"kubernetes.io/projected/3d771753-bad2-43d2-b87e-3f1dbea976b6-kube-api-access-pclgn\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.297034 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.297045 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d771753-bad2-43d2-b87e-3f1dbea976b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.486772 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.510048 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.520489 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.521010 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-metadata" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521032 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-metadata" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.521081 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-log" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521092 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-log" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.521113 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-api" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521127 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-api" Oct 06 16:25:21 crc kubenswrapper[4763]: E1006 16:25:21.521141 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-log" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521151 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-log" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521374 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-api" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521394 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-metadata" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521414 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" containerName="nova-metadata-log" Oct 06 16:25:21 crc kubenswrapper[4763]: 
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.521431 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" containerName="nova-api-log"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.522698 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.525996 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.529548 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.543156 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.554595 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.563575 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.565278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.569434 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.597782 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d771753-bad2-43d2-b87e-3f1dbea976b6" path="/var/lib/kubelet/pods/3d771753-bad2-43d2-b87e-3f1dbea976b6/volumes"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.599389 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c13798-76e6-4850-86ff-535885da0f9d" path="/var/lib/kubelet/pods/e2c13798-76e6-4850-86ff-535885da0f9d/volumes"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.600780 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.601300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.601384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n699\" (UniqueName: \"kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.601461 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0"
Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.601513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") "
pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.703278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.703661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.703753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.703867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.704074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.704266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.704336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.704773 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hncqf\" (UniqueName: \"kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.704842 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n699\" (UniqueName: \"kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.708310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.715446 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.734795 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n699\" (UniqueName: \"kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699\") pod \"nova-api-0\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.806686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.806809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.806880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hncqf\" (UniqueName: \"kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.806931 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.808125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.811535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.815881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.824587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hncqf\" (UniqueName: \"kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf\") pod \"nova-metadata-0\" (UID: 
\"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " pod="openstack/nova-metadata-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.857566 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:21 crc kubenswrapper[4763]: I1006 16:25:21.883945 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.133670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f85f902d-6b8b-4847-8581-b6a42fcc875e","Type":"ContainerStarted","Data":"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa"} Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.134344 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.198089 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.198067462 podStartE2EDuration="2.198067462s" podCreationTimestamp="2025-10-06 16:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:22.19055451 +0000 UTC m=+5519.345847042" watchObservedRunningTime="2025-10-06 16:25:22.198067462 +0000 UTC m=+5519.353359974" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.347032 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:22 crc kubenswrapper[4763]: W1006 16:25:22.353371 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4827f7_2668_49a6_8ccf_4b92421856dc.slice/crio-d7642857a0a1fa92576e93bb4dc1ea4f0f639168f4d6827845900a568be91fb6 WatchSource:0}: Error finding container d7642857a0a1fa92576e93bb4dc1ea4f0f639168f4d6827845900a568be91fb6: Status 404 returned error can't find the container with id d7642857a0a1fa92576e93bb4dc1ea4f0f639168f4d6827845900a568be91fb6 Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.353890 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:22 crc kubenswrapper[4763]: W1006 16:25:22.356614 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51eeac86_7412_461e_86a1_a0e3e9b23587.slice/crio-7354c8099749f2a212abdf82993b11686f994a43217e0090a0dead84153deb6e WatchSource:0}: Error finding container 7354c8099749f2a212abdf82993b11686f994a43217e0090a0dead84153deb6e: Status 404 returned error can't find the container with id 7354c8099749f2a212abdf82993b11686f994a43217e0090a0dead84153deb6e Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.667181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.678596 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.766452 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:25:22 crc kubenswrapper[4763]: I1006 16:25:22.845721 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:25:22 
crc kubenswrapper[4763]: I1006 16:25:22.845944 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="dnsmasq-dns" containerID="cri-o://584f180fbc1c5e63814bc310c17247662ac6e5f06a577ce2b87c06cea316720c" gracePeriod=10 Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.155204 4763 generic.go:334] "Generic (PLEG): container finished" podID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerID="584f180fbc1c5e63814bc310c17247662ac6e5f06a577ce2b87c06cea316720c" exitCode=0 Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.155407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" event={"ID":"158506cf-4e2c-442b-9695-2764c2d9e7e2","Type":"ContainerDied","Data":"584f180fbc1c5e63814bc310c17247662ac6e5f06a577ce2b87c06cea316720c"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.158652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerStarted","Data":"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.158708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerStarted","Data":"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.158720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerStarted","Data":"d7642857a0a1fa92576e93bb4dc1ea4f0f639168f4d6827845900a568be91fb6"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.163490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerStarted","Data":"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.163524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerStarted","Data":"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.163535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerStarted","Data":"7354c8099749f2a212abdf82993b11686f994a43217e0090a0dead84153deb6e"} Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.172320 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.181058 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.181034868 podStartE2EDuration="2.181034868s" podCreationTimestamp="2025-10-06 16:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:23.175008157 +0000 UTC m=+5520.330300669" watchObservedRunningTime="2025-10-06 16:25:23.181034868 +0000 UTC m=+5520.336327390" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.229254 4763 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.22923297 podStartE2EDuration="2.22923297s" podCreationTimestamp="2025-10-06 16:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:23.209283955 +0000 UTC m=+5520.364576487" watchObservedRunningTime="2025-10-06 16:25:23.22923297 +0000 UTC m=+5520.384525482" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.398890 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.542898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb\") pod \"158506cf-4e2c-442b-9695-2764c2d9e7e2\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.543004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb\") pod \"158506cf-4e2c-442b-9695-2764c2d9e7e2\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.543045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc\") pod \"158506cf-4e2c-442b-9695-2764c2d9e7e2\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.543165 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxll\" (UniqueName: \"kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll\") pod \"158506cf-4e2c-442b-9695-2764c2d9e7e2\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.543273 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config\") pod \"158506cf-4e2c-442b-9695-2764c2d9e7e2\" (UID: \"158506cf-4e2c-442b-9695-2764c2d9e7e2\") " Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.547787 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll" (OuterVolumeSpecName: "kube-api-access-xxxll") pod "158506cf-4e2c-442b-9695-2764c2d9e7e2" (UID: "158506cf-4e2c-442b-9695-2764c2d9e7e2"). InnerVolumeSpecName "kube-api-access-xxxll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.582336 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:25:23 crc kubenswrapper[4763]: E1006 16:25:23.582953 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.590770 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "158506cf-4e2c-442b-9695-2764c2d9e7e2" (UID: "158506cf-4e2c-442b-9695-2764c2d9e7e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.597104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config" (OuterVolumeSpecName: "config") pod "158506cf-4e2c-442b-9695-2764c2d9e7e2" (UID: "158506cf-4e2c-442b-9695-2764c2d9e7e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.618872 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "158506cf-4e2c-442b-9695-2764c2d9e7e2" (UID: "158506cf-4e2c-442b-9695-2764c2d9e7e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.618900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "158506cf-4e2c-442b-9695-2764c2d9e7e2" (UID: "158506cf-4e2c-442b-9695-2764c2d9e7e2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.645194 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxll\" (UniqueName: \"kubernetes.io/projected/158506cf-4e2c-442b-9695-2764c2d9e7e2-kube-api-access-xxxll\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.645223 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.645232 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.645242 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:23 crc kubenswrapper[4763]: I1006 16:25:23.645250 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158506cf-4e2c-442b-9695-2764c2d9e7e2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.172459 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.172487 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-7cf27" event={"ID":"158506cf-4e2c-442b-9695-2764c2d9e7e2","Type":"ContainerDied","Data":"77da69b93cefc84e9b281c49f4d21984454cb1cf38702104f520c6269670ff99"} Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.172571 4763 scope.go:117] "RemoveContainer" containerID="584f180fbc1c5e63814bc310c17247662ac6e5f06a577ce2b87c06cea316720c" Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.215813 4763 scope.go:117] "RemoveContainer" containerID="e27e159f45dcae7b83ee1f56ba01d551188c3aa5b8153263501f760d0adca987" Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.215993 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:25:24 crc kubenswrapper[4763]: I1006 16:25:24.227497 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-7cf27"] Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.181758 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.187440 4763 generic.go:334] "Generic (PLEG): container finished" podID="dd162699-19f0-4808-9b7d-f916d3796d34" containerID="38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72" exitCode=0 Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.187479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd162699-19f0-4808-9b7d-f916d3796d34","Type":"ContainerDied","Data":"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72"} Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.187506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd162699-19f0-4808-9b7d-f916d3796d34","Type":"ContainerDied","Data":"2f8e04fae3716a5b4692bd1405dffe9acdc4b5a19332bdbb3329866132a97223"} Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.187523 4763 scope.go:117] "RemoveContainer" containerID="38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.187523 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.213724 4763 scope.go:117] "RemoveContainer" containerID="38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72" Oct 06 16:25:25 crc kubenswrapper[4763]: E1006 16:25:25.214345 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72\": container with ID starting with 38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72 not found: ID does not exist" containerID="38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.214377 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72"} err="failed to get container status \"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72\": rpc error: code = NotFound desc = could not find container \"38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72\": container with ID starting with 38c65b87d3ae0bf3db44648952d4be5b916927824fa8281d3de985e75a11cd72 not found: ID does not exist" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.374019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data\") pod \"dd162699-19f0-4808-9b7d-f916d3796d34\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.374130 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzqnr\" (UniqueName: \"kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr\") pod \"dd162699-19f0-4808-9b7d-f916d3796d34\" (UID: \"dd162699-19f0-4808-9b7d-f916d3796d34\") " Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.374282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle\") pod \"dd162699-19f0-4808-9b7d-f916d3796d34\" (UID: 
\"dd162699-19f0-4808-9b7d-f916d3796d34\") " Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.380298 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr" (OuterVolumeSpecName: "kube-api-access-kzqnr") pod "dd162699-19f0-4808-9b7d-f916d3796d34" (UID: "dd162699-19f0-4808-9b7d-f916d3796d34"). InnerVolumeSpecName "kube-api-access-kzqnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.398443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data" (OuterVolumeSpecName: "config-data") pod "dd162699-19f0-4808-9b7d-f916d3796d34" (UID: "dd162699-19f0-4808-9b7d-f916d3796d34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.404972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd162699-19f0-4808-9b7d-f916d3796d34" (UID: "dd162699-19f0-4808-9b7d-f916d3796d34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.476876 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.476921 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzqnr\" (UniqueName: \"kubernetes.io/projected/dd162699-19f0-4808-9b7d-f916d3796d34-kube-api-access-kzqnr\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.476937 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd162699-19f0-4808-9b7d-f916d3796d34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.534699 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.544146 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.556570 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:25 crc kubenswrapper[4763]: E1006 16:25:25.557160 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd162699-19f0-4808-9b7d-f916d3796d34" containerName="nova-scheduler-scheduler" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.557185 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd162699-19f0-4808-9b7d-f916d3796d34" containerName="nova-scheduler-scheduler" Oct 06 16:25:25 crc kubenswrapper[4763]: E1006 16:25:25.557208 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="init" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.557218 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="init" Oct 06 16:25:25 crc kubenswrapper[4763]: E1006 16:25:25.557230 4763 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="dnsmasq-dns" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.557240 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="dnsmasq-dns" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.557462 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" containerName="dnsmasq-dns" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.557498 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd162699-19f0-4808-9b7d-f916d3796d34" containerName="nova-scheduler-scheduler" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.558456 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.560544 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.594927 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158506cf-4e2c-442b-9695-2764c2d9e7e2" path="/var/lib/kubelet/pods/158506cf-4e2c-442b-9695-2764c2d9e7e2/volumes" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.595597 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd162699-19f0-4808-9b7d-f916d3796d34" path="/var/lib/kubelet/pods/dd162699-19f0-4808-9b7d-f916d3796d34/volumes" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.596315 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.680160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qw8\" (UniqueName: \"kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.680202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.680565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.782323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qw8\" (UniqueName: \"kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.782483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " 
pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.782741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.787343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.788683 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.802280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qw8\" (UniqueName: \"kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8\") pod \"nova-scheduler-0\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:25 crc kubenswrapper[4763]: I1006 16:25:25.887136 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:26 crc kubenswrapper[4763]: I1006 16:25:26.172446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:26 crc kubenswrapper[4763]: W1006 16:25:26.173687 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07e2e80_d414_470e_82f9_ab4376fadbe4.slice/crio-6b70e0aa56d9fc1d230b7359328a5b04a430f142b98f86fa791c34ce924d38d5 WatchSource:0}: Error finding container 6b70e0aa56d9fc1d230b7359328a5b04a430f142b98f86fa791c34ce924d38d5: Status 404 returned error can't find the container with id 6b70e0aa56d9fc1d230b7359328a5b04a430f142b98f86fa791c34ce924d38d5 Oct 06 16:25:26 crc kubenswrapper[4763]: I1006 16:25:26.196929 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f07e2e80-d414-470e-82f9-ab4376fadbe4","Type":"ContainerStarted","Data":"6b70e0aa56d9fc1d230b7359328a5b04a430f142b98f86fa791c34ce924d38d5"} Oct 06 16:25:26 crc kubenswrapper[4763]: I1006 16:25:26.884933 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:25:26 crc kubenswrapper[4763]: I1006 16:25:26.885289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:25:27 crc kubenswrapper[4763]: I1006 16:25:27.216139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f07e2e80-d414-470e-82f9-ab4376fadbe4","Type":"ContainerStarted","Data":"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe"} Oct 06 16:25:27 crc kubenswrapper[4763]: I1006 16:25:27.253994 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.253961263 podStartE2EDuration="2.253961263s" podCreationTimestamp="2025-10-06 
16:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:27.234023809 +0000 UTC m=+5524.389316321" watchObservedRunningTime="2025-10-06 16:25:27.253961263 +0000 UTC m=+5524.409253815" Oct 06 16:25:30 crc kubenswrapper[4763]: I1006 16:25:30.562408 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 16:25:30 crc kubenswrapper[4763]: I1006 16:25:30.887381 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.106308 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6xhjm"] Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.116986 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6xhjm"] Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.117116 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.121046 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.121485 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.305154 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.305217 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvpp\" (UniqueName: \"kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.305301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.305407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.407355 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 
16:25:31.407634 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.407743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvpp\" (UniqueName: \"kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.407856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.416467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.416577 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.419364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.443707 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvpp\" (UniqueName: \"kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp\") pod \"nova-cell1-cell-mapping-6xhjm\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.444146 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.858482 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.858942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.869667 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6xhjm"] Oct 06 16:25:31 crc kubenswrapper[4763]: W1006 16:25:31.875962 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd18b8373_b3de_41e7_96c7_958f69e01594.slice/crio-995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21 WatchSource:0}: Error finding container 995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21: Status 404 returned error can't find the container with id 995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21 Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.884904 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:25:31 crc kubenswrapper[4763]: I1006 16:25:31.884973 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:25:32 crc kubenswrapper[4763]: I1006 16:25:32.273674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6xhjm" event={"ID":"d18b8373-b3de-41e7-96c7-958f69e01594","Type":"ContainerStarted","Data":"4698e2fa70bff7b5347c691d57977185f2cb888b59d8ceedde07630a89787c83"} Oct 06 16:25:32 crc kubenswrapper[4763]: I1006 16:25:32.274002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6xhjm" event={"ID":"d18b8373-b3de-41e7-96c7-958f69e01594","Type":"ContainerStarted","Data":"995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21"} Oct 06 16:25:32 crc kubenswrapper[4763]: I1006 16:25:32.295813 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6xhjm" podStartSLOduration=1.295770295 podStartE2EDuration="1.295770295s" podCreationTimestamp="2025-10-06 16:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:32.294270804 +0000 UTC m=+5529.449563316" watchObservedRunningTime="2025-10-06 16:25:32.295770295 +0000 UTC m=+5529.451062807" Oct 06 16:25:33 crc kubenswrapper[4763]: I1006 16:25:33.022915 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:33 crc kubenswrapper[4763]: I1006 16:25:33.022979 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:33 crc kubenswrapper[4763]: I1006 16:25:33.022937 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:33 crc kubenswrapper[4763]: I1006 16:25:33.023060 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:34 crc kubenswrapper[4763]: I1006 16:25:34.575277 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:25:34 crc kubenswrapper[4763]: E1006 16:25:34.575992 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:25:35 crc kubenswrapper[4763]: I1006 16:25:35.887632 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 16:25:35 crc kubenswrapper[4763]: I1006 16:25:35.926405 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 16:25:36 crc kubenswrapper[4763]: I1006 16:25:36.350513 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 16:25:37 crc kubenswrapper[4763]: I1006 16:25:37.322113 4763 generic.go:334] "Generic (PLEG): container finished" podID="d18b8373-b3de-41e7-96c7-958f69e01594" containerID="4698e2fa70bff7b5347c691d57977185f2cb888b59d8ceedde07630a89787c83" exitCode=0 Oct 06 16:25:37 crc kubenswrapper[4763]: I1006 16:25:37.322161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6xhjm" event={"ID":"d18b8373-b3de-41e7-96c7-958f69e01594","Type":"ContainerDied","Data":"4698e2fa70bff7b5347c691d57977185f2cb888b59d8ceedde07630a89787c83"} Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.727396 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.837883 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data\") pod \"d18b8373-b3de-41e7-96c7-958f69e01594\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.837971 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts\") pod \"d18b8373-b3de-41e7-96c7-958f69e01594\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.838072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvpp\" (UniqueName: \"kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp\") pod \"d18b8373-b3de-41e7-96c7-958f69e01594\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.838134 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle\") pod \"d18b8373-b3de-41e7-96c7-958f69e01594\" (UID: \"d18b8373-b3de-41e7-96c7-958f69e01594\") " Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.843675 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp" (OuterVolumeSpecName: "kube-api-access-6jvpp") pod "d18b8373-b3de-41e7-96c7-958f69e01594" (UID: "d18b8373-b3de-41e7-96c7-958f69e01594"). InnerVolumeSpecName "kube-api-access-6jvpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.845109 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts" (OuterVolumeSpecName: "scripts") pod "d18b8373-b3de-41e7-96c7-958f69e01594" (UID: "d18b8373-b3de-41e7-96c7-958f69e01594"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.865543 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data" (OuterVolumeSpecName: "config-data") pod "d18b8373-b3de-41e7-96c7-958f69e01594" (UID: "d18b8373-b3de-41e7-96c7-958f69e01594"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.871259 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d18b8373-b3de-41e7-96c7-958f69e01594" (UID: "d18b8373-b3de-41e7-96c7-958f69e01594"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.941566 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.942032 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvpp\" (UniqueName: \"kubernetes.io/projected/d18b8373-b3de-41e7-96c7-958f69e01594-kube-api-access-6jvpp\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.942060 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:38 crc kubenswrapper[4763]: I1006 16:25:38.942078 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18b8373-b3de-41e7-96c7-958f69e01594-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.348058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6xhjm" event={"ID":"d18b8373-b3de-41e7-96c7-958f69e01594","Type":"ContainerDied","Data":"995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21"} Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.348277 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995b888a99ead4bf4670835fc8056a491a834b849a21f9d6b1198284a51e3e21" Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.348189 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6xhjm" Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.531318 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.531550 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-log" containerID="cri-o://b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265" gracePeriod=30 Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.531705 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-api" containerID="cri-o://17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84" gracePeriod=30 Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.542405 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.542707 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerName="nova-scheduler-scheduler" containerID="cri-o://9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" gracePeriod=30 Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.603427 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.603720 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" 
containerName="nova-metadata-log" containerID="cri-o://9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d" gracePeriod=30 Oct 06 16:25:39 crc kubenswrapper[4763]: I1006 16:25:39.603834 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-metadata" containerID="cri-o://54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3" gracePeriod=30 Oct 06 16:25:40 crc kubenswrapper[4763]: I1006 16:25:40.356386 4763 generic.go:334] "Generic (PLEG): container finished" podID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerID="b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265" exitCode=143 Oct 06 16:25:40 crc kubenswrapper[4763]: I1006 16:25:40.356475 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerDied","Data":"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265"} Oct 06 16:25:40 crc kubenswrapper[4763]: I1006 16:25:40.358771 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerID="9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d" exitCode=143 Oct 06 16:25:40 crc kubenswrapper[4763]: I1006 16:25:40.358802 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerDied","Data":"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d"} Oct 06 16:25:40 crc kubenswrapper[4763]: E1006 16:25:40.889658 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:25:40 crc kubenswrapper[4763]: E1006 16:25:40.891786 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:25:40 crc kubenswrapper[4763]: E1006 16:25:40.895853 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:25:40 crc kubenswrapper[4763]: E1006 16:25:40.895952 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerName="nova-scheduler-scheduler" Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.897830 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.926197 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8qw8\" (UniqueName: \"kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8\") pod \"f07e2e80-d414-470e-82f9-ab4376fadbe4\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.926257 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data\") pod \"f07e2e80-d414-470e-82f9-ab4376fadbe4\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.926402 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle\") pod \"f07e2e80-d414-470e-82f9-ab4376fadbe4\" (UID: \"f07e2e80-d414-470e-82f9-ab4376fadbe4\") " Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.931933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8" (OuterVolumeSpecName: "kube-api-access-s8qw8") pod "f07e2e80-d414-470e-82f9-ab4376fadbe4" (UID: "f07e2e80-d414-470e-82f9-ab4376fadbe4"). InnerVolumeSpecName "kube-api-access-s8qw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.952349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f07e2e80-d414-470e-82f9-ab4376fadbe4" (UID: "f07e2e80-d414-470e-82f9-ab4376fadbe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:42 crc kubenswrapper[4763]: I1006 16:25:42.954334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data" (OuterVolumeSpecName: "config-data") pod "f07e2e80-d414-470e-82f9-ab4376fadbe4" (UID: "f07e2e80-d414-470e-82f9-ab4376fadbe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.028239 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.028281 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8qw8\" (UniqueName: \"kubernetes.io/projected/f07e2e80-d414-470e-82f9-ab4376fadbe4-kube-api-access-s8qw8\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.028293 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07e2e80-d414-470e-82f9-ab4376fadbe4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.029354 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.128809 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle\") pod \"51eeac86-7412-461e-86a1-a0e3e9b23587\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.128864 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n699\" (UniqueName: \"kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699\") pod \"51eeac86-7412-461e-86a1-a0e3e9b23587\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.128884 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs\") pod \"51eeac86-7412-461e-86a1-a0e3e9b23587\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.128932 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") pod \"51eeac86-7412-461e-86a1-a0e3e9b23587\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.129518 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs" (OuterVolumeSpecName: "logs") pod "51eeac86-7412-461e-86a1-a0e3e9b23587" (UID: "51eeac86-7412-461e-86a1-a0e3e9b23587"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.131898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699" (OuterVolumeSpecName: "kube-api-access-7n699") pod "51eeac86-7412-461e-86a1-a0e3e9b23587" (UID: "51eeac86-7412-461e-86a1-a0e3e9b23587"). InnerVolumeSpecName "kube-api-access-7n699". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.151016 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data podName:51eeac86-7412-461e-86a1-a0e3e9b23587 nodeName:}" failed. No retries permitted until 2025-10-06 16:25:43.650990445 +0000 UTC m=+5540.806282947 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data") pod "51eeac86-7412-461e-86a1-a0e3e9b23587" (UID: "51eeac86-7412-461e-86a1-a0e3e9b23587") : error deleting /var/lib/kubelet/pods/51eeac86-7412-461e-86a1-a0e3e9b23587/volume-subpaths: remove /var/lib/kubelet/pods/51eeac86-7412-461e-86a1-a0e3e9b23587/volume-subpaths: no such file or directory Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.151103 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.153946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51eeac86-7412-461e-86a1-a0e3e9b23587" (UID: "51eeac86-7412-461e-86a1-a0e3e9b23587"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.230370 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.230647 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n699\" (UniqueName: \"kubernetes.io/projected/51eeac86-7412-461e-86a1-a0e3e9b23587-kube-api-access-7n699\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.230726 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51eeac86-7412-461e-86a1-a0e3e9b23587-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.331911 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data\") pod \"cf4827f7-2668-49a6-8ccf-4b92421856dc\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.331967 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs\") pod \"cf4827f7-2668-49a6-8ccf-4b92421856dc\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.331986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hncqf\" (UniqueName: \"kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf\") pod \"cf4827f7-2668-49a6-8ccf-4b92421856dc\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.332028 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle\") pod \"cf4827f7-2668-49a6-8ccf-4b92421856dc\" (UID: \"cf4827f7-2668-49a6-8ccf-4b92421856dc\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.332408 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs" (OuterVolumeSpecName: "logs") pod "cf4827f7-2668-49a6-8ccf-4b92421856dc" (UID: "cf4827f7-2668-49a6-8ccf-4b92421856dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.334804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf" (OuterVolumeSpecName: "kube-api-access-hncqf") pod "cf4827f7-2668-49a6-8ccf-4b92421856dc" (UID: "cf4827f7-2668-49a6-8ccf-4b92421856dc"). InnerVolumeSpecName "kube-api-access-hncqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.357524 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data" (OuterVolumeSpecName: "config-data") pod "cf4827f7-2668-49a6-8ccf-4b92421856dc" (UID: "cf4827f7-2668-49a6-8ccf-4b92421856dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.357858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf4827f7-2668-49a6-8ccf-4b92421856dc" (UID: "cf4827f7-2668-49a6-8ccf-4b92421856dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.390536 4763 generic.go:334] "Generic (PLEG): container finished" podID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerID="17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84" exitCode=0 Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.390593 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.390678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerDied","Data":"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.391266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51eeac86-7412-461e-86a1-a0e3e9b23587","Type":"ContainerDied","Data":"7354c8099749f2a212abdf82993b11686f994a43217e0090a0dead84153deb6e"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.391289 4763 scope.go:117] "RemoveContainer" containerID="17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.392848 4763 generic.go:334] "Generic (PLEG): container finished" podID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" exitCode=0 Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.392921 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.393076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f07e2e80-d414-470e-82f9-ab4376fadbe4","Type":"ContainerDied","Data":"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.393173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f07e2e80-d414-470e-82f9-ab4376fadbe4","Type":"ContainerDied","Data":"6b70e0aa56d9fc1d230b7359328a5b04a430f142b98f86fa791c34ce924d38d5"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.395550 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerID="54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3" exitCode=0 Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.395587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerDied","Data":"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.395607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf4827f7-2668-49a6-8ccf-4b92421856dc","Type":"ContainerDied","Data":"d7642857a0a1fa92576e93bb4dc1ea4f0f639168f4d6827845900a568be91fb6"} Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.395686 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.420915 4763 scope.go:117] "RemoveContainer" containerID="b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.436121 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.436167 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf4827f7-2668-49a6-8ccf-4b92421856dc-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.436186 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hncqf\" (UniqueName: \"kubernetes.io/projected/cf4827f7-2668-49a6-8ccf-4b92421856dc-kube-api-access-hncqf\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.436203 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4827f7-2668-49a6-8ccf-4b92421856dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.463838 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.471811 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.473740 4763 scope.go:117] "RemoveContainer" containerID="17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.474724 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84\": container with ID starting with 17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84 not found: ID does not exist" containerID="17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.474759 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84"} err="failed to get container status \"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84\": rpc error: code = NotFound desc = could not find container \"17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84\": container with ID starting with 17398baa70dc9dfde76a035283d37c5a5db5a7ee7f48f134fb73332625d9ba84 not found: ID does not exist" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.474780 4763 scope.go:117] "RemoveContainer" containerID="b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.475134 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265\": container with ID starting with b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265 not found: ID does not exist" containerID="b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.475182 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265"} err="failed to get container status \"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265\": rpc error: code = NotFound desc = could not find container \"b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265\": container with ID starting with b987e745950044f25ae314e9467960607d7471b00d8635e6c15a86d3e018b265 not found: ID does not exist" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.475212 4763 scope.go:117] "RemoveContainer" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.481386 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.495114 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.500555 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501098 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-log" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501119 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-log" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501133 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-api" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501165 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" 
containerName="nova-api-api" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501179 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerName="nova-scheduler-scheduler" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501185 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerName="nova-scheduler-scheduler" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501195 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-log" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501200 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-log" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501242 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-metadata" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501249 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-metadata" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.501264 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18b8373-b3de-41e7-96c7-958f69e01594" containerName="nova-manage" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501269 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18b8373-b3de-41e7-96c7-958f69e01594" containerName="nova-manage" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501524 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" containerName="nova-scheduler-scheduler" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501567 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18b8373-b3de-41e7-96c7-958f69e01594" containerName="nova-manage" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501580 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-log" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501587 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-log" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501604 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" containerName="nova-api-api" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.501662 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" containerName="nova-metadata-metadata" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.502704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.509450 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.512859 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.514343 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.514470 4763 scope.go:117] "RemoveContainer" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.514817 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe\": container with ID starting with 9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe not found: ID does not exist" containerID="9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.514848 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe"} err="failed to get container status \"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe\": rpc error: code = NotFound desc = could not find container \"9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe\": container with ID starting with 9a6611f68a39b799a88fa3b9f61fe58b72c2e1127b6bfcd875bad33aca6986fe not found: ID does not exist" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.514870 4763 scope.go:117] "RemoveContainer" containerID="54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.517941 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.533689 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536130 4763 scope.go:117] "RemoveContainer" containerID="9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536646 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvg7x\" (UniqueName: \"kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdjwr\" (UniqueName: \"kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc 
kubenswrapper[4763]: I1006 16:25:43.536791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536847 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.536897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.545031 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.558855 4763 scope.go:117] "RemoveContainer" containerID="54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.559308 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3\": container with ID starting with 54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3 not found: ID does not exist" containerID="54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.559366 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3"} err="failed to get container status \"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3\": rpc error: code = NotFound desc = could not find container \"54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3\": container with ID starting with 54427246f5f44c0bf387aafed9bb9d52b65cad33d0f6b4aaec4e86a61cf415c3 not found: ID does not exist" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.559393 4763 scope.go:117] "RemoveContainer" containerID="9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d" Oct 06 16:25:43 crc kubenswrapper[4763]: E1006 16:25:43.559675 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d\": container with ID starting with 9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d not found: ID does not exist" containerID="9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.559712 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d"} err="failed to get container status \"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d\": rpc error: code = NotFound desc = could not find container 
\"9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d\": container with ID starting with 9a34d4293e984273841443bb8dd8b84ed00c6d4680e4bb6824ef54bbe6f0ec0d not found: ID does not exist" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.585272 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4827f7-2668-49a6-8ccf-4b92421856dc" path="/var/lib/kubelet/pods/cf4827f7-2668-49a6-8ccf-4b92421856dc/volumes" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.585944 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07e2e80-d414-470e-82f9-ab4376fadbe4" path="/var/lib/kubelet/pods/f07e2e80-d414-470e-82f9-ab4376fadbe4/volumes" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.638816 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.638906 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.638937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvg7x\" (UniqueName: \"kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.638963 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdjwr\" (UniqueName: \"kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.638986 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.639010 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.639043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.640451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " 
pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.642846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.643868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.644040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.645861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.652979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdjwr\" (UniqueName: \"kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr\") pod \"nova-metadata-0\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.654453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvg7x\" (UniqueName: \"kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x\") pod \"nova-scheduler-0\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " pod="openstack/nova-scheduler-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.740554 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") pod \"51eeac86-7412-461e-86a1-a0e3e9b23587\" (UID: \"51eeac86-7412-461e-86a1-a0e3e9b23587\") " Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.743242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data" (OuterVolumeSpecName: "config-data") pod "51eeac86-7412-461e-86a1-a0e3e9b23587" (UID: "51eeac86-7412-461e-86a1-a0e3e9b23587"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.838512 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.843048 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eeac86-7412-461e-86a1-a0e3e9b23587-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:25:43 crc kubenswrapper[4763]: I1006 16:25:43.847496 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.049424 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.099070 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.110676 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.112262 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.114306 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.121298 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.188966 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.189024 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.189050 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twncb\" (UniqueName: \"kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.189074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.290502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.290657 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.290774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twncb\" (UniqueName: \"kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 
16:25:44.290949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.291457 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.299659 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.300289 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.308341 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twncb\" (UniqueName: \"kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb\") pod \"nova-api-0\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.380041 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: W1006 16:25:44.385315 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87b7f703_6648_461e_8edb_63d1351c3cbb.slice/crio-f5b0221461d6a869ccbfbacabba2b9212add8e45593c7bf492f3c6fe56d9aade WatchSource:0}: Error finding container f5b0221461d6a869ccbfbacabba2b9212add8e45593c7bf492f3c6fe56d9aade: Status 404 returned error can't find the container with id f5b0221461d6a869ccbfbacabba2b9212add8e45593c7bf492f3c6fe56d9aade Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.430803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87b7f703-6648-461e-8edb-63d1351c3cbb","Type":"ContainerStarted","Data":"f5b0221461d6a869ccbfbacabba2b9212add8e45593c7bf492f3c6fe56d9aade"} Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.431349 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.431791 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:25:44 crc kubenswrapper[4763]: W1006 16:25:44.440674 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6856d20c_96f8_415b_a86a_1231966beb28.slice/crio-2ea31026be0a9252493b9e4cb07d17781bc236a3f3620ff91fe9a338ac6383e7 WatchSource:0}: Error finding container 2ea31026be0a9252493b9e4cb07d17781bc236a3f3620ff91fe9a338ac6383e7: Status 404 returned error can't find the container with id 2ea31026be0a9252493b9e4cb07d17781bc236a3f3620ff91fe9a338ac6383e7 Oct 06 16:25:44 crc kubenswrapper[4763]: I1006 16:25:44.695414 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:25:44 crc kubenswrapper[4763]: W1006 16:25:44.698481 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36320b5b_5ae6_4e50_af8e_b8dca958b12e.slice/crio-8b507e8c726fe686527993690ff8d592fc214d0d12501e5eb50a0f365a099684 WatchSource:0}: Error finding container 8b507e8c726fe686527993690ff8d592fc214d0d12501e5eb50a0f365a099684: Status 404 returned error can't find the container with id 8b507e8c726fe686527993690ff8d592fc214d0d12501e5eb50a0f365a099684 Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.442721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerStarted","Data":"7303d1a0cfb0b2bf8c33f9046e072c23106bf7cea6c6b2d15cee798002c214c1"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.443022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerStarted","Data":"077e2a22de54d1eb090b9182c659466749abb5d721bfedc293afdab7753a7ef5"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.443044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerStarted","Data":"2ea31026be0a9252493b9e4cb07d17781bc236a3f3620ff91fe9a338ac6383e7"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.446163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87b7f703-6648-461e-8edb-63d1351c3cbb","Type":"ContainerStarted","Data":"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.450376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerStarted","Data":"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.450515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerStarted","Data":"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.450602 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerStarted","Data":"8b507e8c726fe686527993690ff8d592fc214d0d12501e5eb50a0f365a099684"} Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.473392 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.473369129 podStartE2EDuration="2.473369129s" podCreationTimestamp="2025-10-06 16:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:45.461754558 +0000 UTC m=+5542.617047080" watchObservedRunningTime="2025-10-06 16:25:45.473369129 +0000 UTC m=+5542.628661661" Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.495938 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.495915933 podStartE2EDuration="2.495915933s" podCreationTimestamp="2025-10-06 16:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:45.486401768 +0000 UTC m=+5542.641694290" watchObservedRunningTime="2025-10-06 16:25:45.495915933 +0000 UTC m=+5542.651208455" Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.516288 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.516267588 podStartE2EDuration="1.516267588s" podCreationTimestamp="2025-10-06 16:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:25:45.510042741 +0000 UTC m=+5542.665335273" watchObservedRunningTime="2025-10-06 16:25:45.516267588 +0000 UTC m=+5542.671560110" Oct 06 16:25:45 crc kubenswrapper[4763]: I1006 16:25:45.588417 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51eeac86-7412-461e-86a1-a0e3e9b23587" path="/var/lib/kubelet/pods/51eeac86-7412-461e-86a1-a0e3e9b23587/volumes" Oct 06 16:25:47 crc kubenswrapper[4763]: I1006 16:25:47.575058 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:25:47 crc kubenswrapper[4763]: E1006 16:25:47.576488 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:25:48 crc kubenswrapper[4763]: I1006 16:25:48.839646 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:25:48 crc kubenswrapper[4763]: I1006 16:25:48.839730 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:25:48 crc kubenswrapper[4763]: I1006 16:25:48.848028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 16:25:53 crc kubenswrapper[4763]: I1006 16:25:53.838853 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:25:53 crc kubenswrapper[4763]: I1006 16:25:53.841198 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:25:53 crc kubenswrapper[4763]: I1006 16:25:53.847651 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 16:25:53 crc kubenswrapper[4763]: I1006 16:25:53.873214 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Oct 06 16:25:54 crc kubenswrapper[4763]: I1006 16:25:54.433106 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:25:54 crc kubenswrapper[4763]: I1006 16:25:54.433326 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:25:54 crc kubenswrapper[4763]: I1006 16:25:54.592244 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 16:25:54 crc kubenswrapper[4763]: I1006 16:25:54.920810 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:54 crc kubenswrapper[4763]: I1006 16:25:54.920862 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:55 crc kubenswrapper[4763]: I1006 16:25:55.515836 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:25:55 crc kubenswrapper[4763]: I1006 16:25:55.515858 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:26:02 crc kubenswrapper[4763]: I1006 16:26:02.575124 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:26:02 crc kubenswrapper[4763]: E1006 16:26:02.576057 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:26:03 crc kubenswrapper[4763]: I1006 16:26:03.840661 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 16:26:03 crc kubenswrapper[4763]: I1006 16:26:03.840982 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 16:26:03 crc kubenswrapper[4763]: I1006 16:26:03.842509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 16:26:03 crc kubenswrapper[4763]: I1006 16:26:03.843236 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.436664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 16:26:04 crc 
kubenswrapper[4763]: I1006 16:26:04.436994 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.437379 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.440207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.645451 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.649109 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.854300 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.857767 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.898572 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.992192 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.992482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.992519 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-475rp\" (UniqueName: \"kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.992565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:04 crc kubenswrapper[4763]: I1006 16:26:04.992595 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.094594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.094667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.094697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-475rp\" (UniqueName: \"kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.094731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.094753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.107149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.107694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.108004 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.108149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.130416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-475rp\" (UniqueName: \"kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp\") pod \"dnsmasq-dns-85c7886d8f-c5zvs\" (UID: 
\"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.198652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:05 crc kubenswrapper[4763]: I1006 16:26:05.662492 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:06 crc kubenswrapper[4763]: I1006 16:26:06.682703 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerID="cd25a07340c1b4d1e77a502ea88783280cf8e7daf1d40a8383951fb9fa4f3edc" exitCode=0 Oct 06 16:26:06 crc kubenswrapper[4763]: I1006 16:26:06.684182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" event={"ID":"a9c692a9-f907-437b-a5cf-51e48a5450e2","Type":"ContainerDied","Data":"cd25a07340c1b4d1e77a502ea88783280cf8e7daf1d40a8383951fb9fa4f3edc"} Oct 06 16:26:06 crc kubenswrapper[4763]: I1006 16:26:06.684222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" event={"ID":"a9c692a9-f907-437b-a5cf-51e48a5450e2","Type":"ContainerStarted","Data":"3f44e3bbcdde85abf1e3fa1cc68d11cf6d1ae6f189aac5557626240535fa3840"} Oct 06 16:26:07 crc kubenswrapper[4763]: I1006 16:26:07.693186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" event={"ID":"a9c692a9-f907-437b-a5cf-51e48a5450e2","Type":"ContainerStarted","Data":"6af3a4b3a9ce91a918c25d0add4ff1fc23a2b63506269855094986a350dc9dc1"} Oct 06 16:26:07 crc kubenswrapper[4763]: I1006 16:26:07.694662 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:07 crc kubenswrapper[4763]: I1006 16:26:07.712409 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" podStartSLOduration=3.712392341 podStartE2EDuration="3.712392341s" podCreationTimestamp="2025-10-06 16:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:07.708368183 +0000 UTC m=+5564.863660695" watchObservedRunningTime="2025-10-06 16:26:07.712392341 +0000 UTC m=+5564.867684853" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.201027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.270385 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.271055 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="dnsmasq-dns" containerID="cri-o://179a291999c36f551433be402096687932bbb292cc0bfb73cd3377bec7f81ef1" gracePeriod=10 Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.793860 4763 generic.go:334] "Generic (PLEG): container finished" podID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerID="179a291999c36f551433be402096687932bbb292cc0bfb73cd3377bec7f81ef1" exitCode=0 Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.793941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" 
event={"ID":"31c54c0b-4550-41a7-a851-5aed75eee87e","Type":"ContainerDied","Data":"179a291999c36f551433be402096687932bbb292cc0bfb73cd3377bec7f81ef1"} Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.794187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" event={"ID":"31c54c0b-4550-41a7-a851-5aed75eee87e","Type":"ContainerDied","Data":"afd38c41f5cff00421924aec2c475c35c18ca5a3d13bff0e9a73dae0347d06cf"} Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.794206 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd38c41f5cff00421924aec2c475c35c18ca5a3d13bff0e9a73dae0347d06cf" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.799115 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.835464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc\") pod \"31c54c0b-4550-41a7-a851-5aed75eee87e\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.835530 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb\") pod \"31c54c0b-4550-41a7-a851-5aed75eee87e\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.835582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb\") pod \"31c54c0b-4550-41a7-a851-5aed75eee87e\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.835745 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config\") pod \"31c54c0b-4550-41a7-a851-5aed75eee87e\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.836393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh755\" (UniqueName: \"kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755\") pod \"31c54c0b-4550-41a7-a851-5aed75eee87e\" (UID: \"31c54c0b-4550-41a7-a851-5aed75eee87e\") " Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.843434 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755" (OuterVolumeSpecName: "kube-api-access-dh755") pod "31c54c0b-4550-41a7-a851-5aed75eee87e" (UID: "31c54c0b-4550-41a7-a851-5aed75eee87e"). InnerVolumeSpecName "kube-api-access-dh755". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.879412 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31c54c0b-4550-41a7-a851-5aed75eee87e" (UID: "31c54c0b-4550-41a7-a851-5aed75eee87e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.894805 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31c54c0b-4550-41a7-a851-5aed75eee87e" (UID: "31c54c0b-4550-41a7-a851-5aed75eee87e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.902566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config" (OuterVolumeSpecName: "config") pod "31c54c0b-4550-41a7-a851-5aed75eee87e" (UID: "31c54c0b-4550-41a7-a851-5aed75eee87e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.910572 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31c54c0b-4550-41a7-a851-5aed75eee87e" (UID: "31c54c0b-4550-41a7-a851-5aed75eee87e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.938724 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh755\" (UniqueName: \"kubernetes.io/projected/31c54c0b-4550-41a7-a851-5aed75eee87e-kube-api-access-dh755\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.938755 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.938764 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.938773 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:15 crc kubenswrapper[4763]: I1006 16:26:15.938782 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c54c0b-4550-41a7-a851-5aed75eee87e-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:16 crc kubenswrapper[4763]: I1006 16:26:16.802529 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-s6l89" Oct 06 16:26:16 crc kubenswrapper[4763]: I1006 16:26:16.840019 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:26:16 crc kubenswrapper[4763]: I1006 16:26:16.846982 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-s6l89"] Oct 06 16:26:17 crc kubenswrapper[4763]: I1006 16:26:17.575422 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:26:17 crc kubenswrapper[4763]: E1006 16:26:17.575920 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:26:17 crc kubenswrapper[4763]: I1006 16:26:17.586596 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" path="/var/lib/kubelet/pods/31c54c0b-4550-41a7-a851-5aed75eee87e/volumes" Oct 06 16:26:18 crc kubenswrapper[4763]: I1006 16:26:18.989389 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tx54f"] Oct 06 16:26:18 crc kubenswrapper[4763]: E1006 16:26:18.990092 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="init" Oct 06 16:26:18 crc kubenswrapper[4763]: I1006 16:26:18.990106 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="init" Oct 06 16:26:18 crc kubenswrapper[4763]: E1006 16:26:18.990131 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="dnsmasq-dns" Oct 06 16:26:18 crc kubenswrapper[4763]: I1006 16:26:18.990137 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="dnsmasq-dns" Oct 06 16:26:18 crc kubenswrapper[4763]: I1006 16:26:18.990298 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c54c0b-4550-41a7-a851-5aed75eee87e" containerName="dnsmasq-dns" Oct 06 16:26:18 crc kubenswrapper[4763]: I1006 16:26:18.991011 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.020692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tx54f"] Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.096027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7mk\" (UniqueName: \"kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk\") pod \"cinder-db-create-tx54f\" (UID: \"16239947-411d-416c-be77-4b34624cfc2f\") " pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.198092 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7mk\" (UniqueName: \"kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk\") pod \"cinder-db-create-tx54f\" (UID: \"16239947-411d-416c-be77-4b34624cfc2f\") " pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.216394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7mk\" (UniqueName: \"kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk\") pod \"cinder-db-create-tx54f\" (UID: \"16239947-411d-416c-be77-4b34624cfc2f\") " pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.314378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.756939 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tx54f"] Oct 06 16:26:19 crc kubenswrapper[4763]: I1006 16:26:19.844079 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tx54f" event={"ID":"16239947-411d-416c-be77-4b34624cfc2f","Type":"ContainerStarted","Data":"6c59db991bf0c96ae75c47b166c19ff75462e59cff5a978338fb5d63f3cfb164"} Oct 06 16:26:20 crc kubenswrapper[4763]: I1006 16:26:20.856843 4763 generic.go:334] "Generic (PLEG): container finished" podID="16239947-411d-416c-be77-4b34624cfc2f" containerID="2fb93126c951c2d488f9aff1999eca5bb26a7a76260b06b3634316ee68445098" exitCode=0 Oct 06 16:26:20 crc kubenswrapper[4763]: I1006 16:26:20.856982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tx54f" event={"ID":"16239947-411d-416c-be77-4b34624cfc2f","Type":"ContainerDied","Data":"2fb93126c951c2d488f9aff1999eca5bb26a7a76260b06b3634316ee68445098"} Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.277679 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.356804 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj7mk\" (UniqueName: \"kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk\") pod \"16239947-411d-416c-be77-4b34624cfc2f\" (UID: \"16239947-411d-416c-be77-4b34624cfc2f\") " Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.362246 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk" (OuterVolumeSpecName: "kube-api-access-hj7mk") pod "16239947-411d-416c-be77-4b34624cfc2f" (UID: "16239947-411d-416c-be77-4b34624cfc2f"). 
InnerVolumeSpecName "kube-api-access-hj7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.458612 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj7mk\" (UniqueName: \"kubernetes.io/projected/16239947-411d-416c-be77-4b34624cfc2f-kube-api-access-hj7mk\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.886662 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tx54f" event={"ID":"16239947-411d-416c-be77-4b34624cfc2f","Type":"ContainerDied","Data":"6c59db991bf0c96ae75c47b166c19ff75462e59cff5a978338fb5d63f3cfb164"} Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.886712 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c59db991bf0c96ae75c47b166c19ff75462e59cff5a978338fb5d63f3cfb164" Oct 06 16:26:22 crc kubenswrapper[4763]: I1006 16:26:22.887422 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tx54f" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.051694 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bf6a-account-create-g9rrk"] Oct 06 16:26:29 crc kubenswrapper[4763]: E1006 16:26:29.052960 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16239947-411d-416c-be77-4b34624cfc2f" containerName="mariadb-database-create" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.052981 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="16239947-411d-416c-be77-4b34624cfc2f" containerName="mariadb-database-create" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.053256 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="16239947-411d-416c-be77-4b34624cfc2f" containerName="mariadb-database-create" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.054073 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.058385 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.063928 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bf6a-account-create-g9rrk"] Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.184754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvx5s\" (UniqueName: \"kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s\") pod \"cinder-bf6a-account-create-g9rrk\" (UID: \"952bb439-e564-4c01-9d6f-07b1e941434a\") " pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.286920 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvx5s\" (UniqueName: \"kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s\") pod \"cinder-bf6a-account-create-g9rrk\" (UID: \"952bb439-e564-4c01-9d6f-07b1e941434a\") " pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.320340 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvx5s\" (UniqueName: \"kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s\") pod \"cinder-bf6a-account-create-g9rrk\" (UID: \"952bb439-e564-4c01-9d6f-07b1e941434a\") " pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.390306 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.575140 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:26:29 crc kubenswrapper[4763]: E1006 16:26:29.575702 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.871650 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bf6a-account-create-g9rrk"] Oct 06 16:26:29 crc kubenswrapper[4763]: W1006 16:26:29.876858 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod952bb439_e564_4c01_9d6f_07b1e941434a.slice/crio-fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab WatchSource:0}: Error finding container fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab: Status 404 returned error can't find the container with id fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab Oct 06 16:26:29 crc kubenswrapper[4763]: I1006 16:26:29.973419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bf6a-account-create-g9rrk" event={"ID":"952bb439-e564-4c01-9d6f-07b1e941434a","Type":"ContainerStarted","Data":"fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab"} Oct 06 16:26:30 crc 
kubenswrapper[4763]: I1006 16:26:30.988425 4763 generic.go:334] "Generic (PLEG): container finished" podID="952bb439-e564-4c01-9d6f-07b1e941434a" containerID="25c9fa62dc673320e1af2c553326145d88577980050a9e6653b43522d803a339" exitCode=0 Oct 06 16:26:30 crc kubenswrapper[4763]: I1006 16:26:30.988512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bf6a-account-create-g9rrk" event={"ID":"952bb439-e564-4c01-9d6f-07b1e941434a","Type":"ContainerDied","Data":"25c9fa62dc673320e1af2c553326145d88577980050a9e6653b43522d803a339"} Oct 06 16:26:32 crc kubenswrapper[4763]: I1006 16:26:32.412145 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:32 crc kubenswrapper[4763]: I1006 16:26:32.450244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvx5s\" (UniqueName: \"kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s\") pod \"952bb439-e564-4c01-9d6f-07b1e941434a\" (UID: \"952bb439-e564-4c01-9d6f-07b1e941434a\") " Oct 06 16:26:32 crc kubenswrapper[4763]: I1006 16:26:32.457465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s" (OuterVolumeSpecName: "kube-api-access-lvx5s") pod "952bb439-e564-4c01-9d6f-07b1e941434a" (UID: "952bb439-e564-4c01-9d6f-07b1e941434a"). InnerVolumeSpecName "kube-api-access-lvx5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:32 crc kubenswrapper[4763]: I1006 16:26:32.552093 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvx5s\" (UniqueName: \"kubernetes.io/projected/952bb439-e564-4c01-9d6f-07b1e941434a-kube-api-access-lvx5s\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:33 crc kubenswrapper[4763]: I1006 16:26:33.014644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bf6a-account-create-g9rrk" event={"ID":"952bb439-e564-4c01-9d6f-07b1e941434a","Type":"ContainerDied","Data":"fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab"} Oct 06 16:26:33 crc kubenswrapper[4763]: I1006 16:26:33.014688 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa8b86619e28381598af43a0a718bd20952dbc3d525466b50631ff09fa2e7cab" Oct 06 16:26:33 crc kubenswrapper[4763]: I1006 16:26:33.014710 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bf6a-account-create-g9rrk" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.220804 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-cnfnt"] Oct 06 16:26:34 crc kubenswrapper[4763]: E1006 16:26:34.221679 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952bb439-e564-4c01-9d6f-07b1e941434a" containerName="mariadb-account-create" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.221706 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="952bb439-e564-4c01-9d6f-07b1e941434a" containerName="mariadb-account-create" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.221952 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="952bb439-e564-4c01-9d6f-07b1e941434a" containerName="mariadb-account-create" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.222849 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.226745 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lc7wc" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.227012 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.227279 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.229194 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cnfnt"] Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290646 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.290810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hdx\" (UniqueName: \"kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.392578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hdx\" (UniqueName: \"kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.392977 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.393092 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.393108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.393193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.393326 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.393390 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.397013 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.397162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.397449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.397943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " 
pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.414578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hdx\" (UniqueName: \"kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx\") pod \"cinder-db-sync-cnfnt\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.541812 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:34 crc kubenswrapper[4763]: I1006 16:26:34.983779 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cnfnt"] Oct 06 16:26:35 crc kubenswrapper[4763]: I1006 16:26:35.035049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cnfnt" event={"ID":"801e07ad-1c28-4afe-a196-77098d825544","Type":"ContainerStarted","Data":"7e7f8c41034d8d4bd603bb88df2879cea2e493cbda788fea1bd83091d5814ec0"} Oct 06 16:26:36 crc kubenswrapper[4763]: I1006 16:26:36.054680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cnfnt" event={"ID":"801e07ad-1c28-4afe-a196-77098d825544","Type":"ContainerStarted","Data":"0ea19e1c9094053e750b7a5aadd32c1483cfb26c4e021366e6930b9aaae6fd7a"} Oct 06 16:26:36 crc kubenswrapper[4763]: I1006 16:26:36.075726 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-cnfnt" podStartSLOduration=2.075708851 podStartE2EDuration="2.075708851s" podCreationTimestamp="2025-10-06 16:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:36.071742935 +0000 UTC m=+5593.227035447" watchObservedRunningTime="2025-10-06 16:26:36.075708851 +0000 UTC m=+5593.231001363" Oct 06 16:26:38 crc kubenswrapper[4763]: I1006 16:26:38.077965 4763 generic.go:334] "Generic (PLEG): container finished" podID="801e07ad-1c28-4afe-a196-77098d825544" containerID="0ea19e1c9094053e750b7a5aadd32c1483cfb26c4e021366e6930b9aaae6fd7a" exitCode=0 Oct 06 16:26:38 crc kubenswrapper[4763]: I1006 16:26:38.078091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cnfnt" event={"ID":"801e07ad-1c28-4afe-a196-77098d825544","Type":"ContainerDied","Data":"0ea19e1c9094053e750b7a5aadd32c1483cfb26c4e021366e6930b9aaae6fd7a"} Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.444993 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.594850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595123 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595177 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hdx\" (UniqueName: \"kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.595945 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data\") pod \"801e07ad-1c28-4afe-a196-77098d825544\" (UID: \"801e07ad-1c28-4afe-a196-77098d825544\") " Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.596457 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801e07ad-1c28-4afe-a196-77098d825544-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.605901 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts" (OuterVolumeSpecName: "scripts") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.605945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx" (OuterVolumeSpecName: "kube-api-access-m5hdx") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "kube-api-access-m5hdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.605977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.623794 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.642842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data" (OuterVolumeSpecName: "config-data") pod "801e07ad-1c28-4afe-a196-77098d825544" (UID: "801e07ad-1c28-4afe-a196-77098d825544"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.698511 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.698756 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.698818 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hdx\" (UniqueName: \"kubernetes.io/projected/801e07ad-1c28-4afe-a196-77098d825544-kube-api-access-m5hdx\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.698902 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:39 crc kubenswrapper[4763]: I1006 16:26:39.698958 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e07ad-1c28-4afe-a196-77098d825544-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.099121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cnfnt" event={"ID":"801e07ad-1c28-4afe-a196-77098d825544","Type":"ContainerDied","Data":"7e7f8c41034d8d4bd603bb88df2879cea2e493cbda788fea1bd83091d5814ec0"} Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.099162 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7f8c41034d8d4bd603bb88df2879cea2e493cbda788fea1bd83091d5814ec0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.099213 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cnfnt" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.418563 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:26:40 crc kubenswrapper[4763]: E1006 16:26:40.419083 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801e07ad-1c28-4afe-a196-77098d825544" containerName="cinder-db-sync" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.419108 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="801e07ad-1c28-4afe-a196-77098d825544" containerName="cinder-db-sync" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.419349 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="801e07ad-1c28-4afe-a196-77098d825544" containerName="cinder-db-sync" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.420538 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.438587 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.514628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.514713 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.514740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rttds\" (UniqueName: \"kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.514822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.515153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.578858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.580445 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.582055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.583905 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.584177 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lc7wc" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.584355 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.594917 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.616343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.616406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.616447 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.616465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rttds\" (UniqueName: \"kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.616499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.617566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.617630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 
16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.618107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.618338 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.636073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rttds\" (UniqueName: \"kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds\") pod \"dnsmasq-dns-7784748f7f-n2jvs\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.718882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljvv\" (UniqueName: \"kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719394 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719475 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.719765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.736003 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.821835 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.821947 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.821989 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mljvv\" (UniqueName: \"kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.822058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.822112 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.822142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.822183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.822268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.825004 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc 
kubenswrapper[4763]: I1006 16:26:40.831734 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.834081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.834167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.842982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.859311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljvv\" (UniqueName: \"kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv\") pod \"cinder-api-0\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " pod="openstack/cinder-api-0" Oct 06 16:26:40 crc kubenswrapper[4763]: I1006 16:26:40.901036 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 16:26:41 crc kubenswrapper[4763]: I1006 16:26:41.232099 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:26:41 crc kubenswrapper[4763]: I1006 16:26:41.389193 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 16:26:41 crc kubenswrapper[4763]: W1006 16:26:41.395013 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeabf008_f00e_4394_a691_affd62647290.slice/crio-b1869bdf447703fc95201d8e47f662cd1839d798b79a09e08ba8f9e2d4795a84 WatchSource:0}: Error finding container b1869bdf447703fc95201d8e47f662cd1839d798b79a09e08ba8f9e2d4795a84: Status 404 returned error can't find the container with id b1869bdf447703fc95201d8e47f662cd1839d798b79a09e08ba8f9e2d4795a84 Oct 06 16:26:42 crc kubenswrapper[4763]: I1006 16:26:42.126360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerStarted","Data":"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"} Oct 06 16:26:42 crc kubenswrapper[4763]: I1006 16:26:42.126710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerStarted","Data":"b1869bdf447703fc95201d8e47f662cd1839d798b79a09e08ba8f9e2d4795a84"} Oct 06 16:26:42 crc kubenswrapper[4763]: I1006 16:26:42.129227 4763 generic.go:334] "Generic (PLEG): container finished" podID="1317520f-29bf-4615-b385-b03a9ffa898e" containerID="16b1086aa399df279da15b3b684b672230925ed0d17ac793a0e91a03333f299b" exitCode=0 Oct 06 16:26:42 crc kubenswrapper[4763]: I1006 16:26:42.129260 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" event={"ID":"1317520f-29bf-4615-b385-b03a9ffa898e","Type":"ContainerDied","Data":"16b1086aa399df279da15b3b684b672230925ed0d17ac793a0e91a03333f299b"} Oct 06 16:26:42 crc kubenswrapper[4763]: I1006 16:26:42.129281 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" event={"ID":"1317520f-29bf-4615-b385-b03a9ffa898e","Type":"ContainerStarted","Data":"2d867ef64682c7d96f31fc3519e27ad901f04b9738ad8870b33cc69c7254cdee"} Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.142984 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerStarted","Data":"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d"} Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.143471 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.147883 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" event={"ID":"1317520f-29bf-4615-b385-b03a9ffa898e","Type":"ContainerStarted","Data":"ae1c758d43eec4438ddc622373d0efdbdad91508e76b510e36922681a2272269"} Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.148243 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.171565 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.171527498 
podStartE2EDuration="3.171527498s" podCreationTimestamp="2025-10-06 16:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:43.16899898 +0000 UTC m=+5600.324291502" watchObservedRunningTime="2025-10-06 16:26:43.171527498 +0000 UTC m=+5600.326820060" Oct 06 16:26:43 crc kubenswrapper[4763]: I1006 16:26:43.201890 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" podStartSLOduration=3.201875091 podStartE2EDuration="3.201875091s" podCreationTimestamp="2025-10-06 16:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:43.197088933 +0000 UTC m=+5600.352381445" watchObservedRunningTime="2025-10-06 16:26:43.201875091 +0000 UTC m=+5600.357167603" Oct 06 16:26:44 crc kubenswrapper[4763]: I1006 16:26:44.575581 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:26:45 crc kubenswrapper[4763]: I1006 16:26:45.167007 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272"} Oct 06 16:26:50 crc kubenswrapper[4763]: I1006 16:26:50.738882 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:26:50 crc kubenswrapper[4763]: I1006 16:26:50.825908 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:50 crc kubenswrapper[4763]: I1006 16:26:50.826266 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="dnsmasq-dns" containerID="cri-o://6af3a4b3a9ce91a918c25d0add4ff1fc23a2b63506269855094986a350dc9dc1" gracePeriod=10 Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.240836 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerID="6af3a4b3a9ce91a918c25d0add4ff1fc23a2b63506269855094986a350dc9dc1" exitCode=0 Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.241059 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" event={"ID":"a9c692a9-f907-437b-a5cf-51e48a5450e2","Type":"ContainerDied","Data":"6af3a4b3a9ce91a918c25d0add4ff1fc23a2b63506269855094986a350dc9dc1"} Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.400051 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.547785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc\") pod \"a9c692a9-f907-437b-a5cf-51e48a5450e2\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.547836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb\") pod \"a9c692a9-f907-437b-a5cf-51e48a5450e2\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.547875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb\") pod \"a9c692a9-f907-437b-a5cf-51e48a5450e2\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.547899 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-475rp\" (UniqueName: \"kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp\") pod \"a9c692a9-f907-437b-a5cf-51e48a5450e2\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.547949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config\") pod \"a9c692a9-f907-437b-a5cf-51e48a5450e2\" (UID: \"a9c692a9-f907-437b-a5cf-51e48a5450e2\") " Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.564790 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp" (OuterVolumeSpecName: "kube-api-access-475rp") pod "a9c692a9-f907-437b-a5cf-51e48a5450e2" (UID: "a9c692a9-f907-437b-a5cf-51e48a5450e2"). InnerVolumeSpecName "kube-api-access-475rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.616193 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9c692a9-f907-437b-a5cf-51e48a5450e2" (UID: "a9c692a9-f907-437b-a5cf-51e48a5450e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.630084 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9c692a9-f907-437b-a5cf-51e48a5450e2" (UID: "a9c692a9-f907-437b-a5cf-51e48a5450e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.634333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config" (OuterVolumeSpecName: "config") pod "a9c692a9-f907-437b-a5cf-51e48a5450e2" (UID: "a9c692a9-f907-437b-a5cf-51e48a5450e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.650474 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.650515 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.650529 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-475rp\" (UniqueName: \"kubernetes.io/projected/a9c692a9-f907-437b-a5cf-51e48a5450e2-kube-api-access-475rp\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.650543 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.652781 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9c692a9-f907-437b-a5cf-51e48a5450e2" (UID: "a9c692a9-f907-437b-a5cf-51e48a5450e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:26:51 crc kubenswrapper[4763]: I1006 16:26:51.752794 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c692a9-f907-437b-a5cf-51e48a5450e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.255079 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" event={"ID":"a9c692a9-f907-437b-a5cf-51e48a5450e2","Type":"ContainerDied","Data":"3f44e3bbcdde85abf1e3fa1cc68d11cf6d1ae6f189aac5557626240535fa3840"} Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.255147 4763 scope.go:117] "RemoveContainer" containerID="6af3a4b3a9ce91a918c25d0add4ff1fc23a2b63506269855094986a350dc9dc1" Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.255410 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-c5zvs" Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.290562 4763 scope.go:117] "RemoveContainer" containerID="cd25a07340c1b4d1e77a502ea88783280cf8e7daf1d40a8383951fb9fa4f3edc" Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.306944 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.316970 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-c5zvs"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.801340 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.801669 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" containerID="cri-o://077e2a22de54d1eb090b9182c659466749abb5d721bfedc293afdab7753a7ef5" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.801790 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" containerID="cri-o://7303d1a0cfb0b2bf8c33f9046e072c23106bf7cea6c6b2d15cee798002c214c1" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.820766 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.821214 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-api" containerID="cri-o://a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.822736 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-log" containerID="cri-o://7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.837088 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.837366 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerName="nova-scheduler-scheduler" containerID="cri-o://b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.849343 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.849746 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d856fcef-f614-4291-818c-3f77282f5940" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.865081 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.865318 4763 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9ed10075-b98c-4856-9515-d2ee60040058" containerName="nova-cell0-conductor-conductor" containerID="cri-o://09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c" gracePeriod=30 Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.931220 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:26:52 crc kubenswrapper[4763]: I1006 16:26:52.931823 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" gracePeriod=30 Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.067577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.269329 4763 generic.go:334] "Generic (PLEG): container finished" podID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerID="7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767" exitCode=143 Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.269446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerDied","Data":"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767"} Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.273801 4763 generic.go:334] "Generic (PLEG): container finished" podID="6856d20c-96f8-415b-a86a-1231966beb28" containerID="077e2a22de54d1eb090b9182c659466749abb5d721bfedc293afdab7753a7ef5" exitCode=143 Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.273862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerDied","Data":"077e2a22de54d1eb090b9182c659466749abb5d721bfedc293afdab7753a7ef5"} Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.588715 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" path="/var/lib/kubelet/pods/a9c692a9-f907-437b-a5cf-51e48a5450e2/volumes" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.712493 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.805265 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data\") pod \"d856fcef-f614-4291-818c-3f77282f5940\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.805424 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle\") pod \"d856fcef-f614-4291-818c-3f77282f5940\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.805518 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkqxw\" (UniqueName: \"kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw\") pod \"d856fcef-f614-4291-818c-3f77282f5940\" (UID: \"d856fcef-f614-4291-818c-3f77282f5940\") " Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.812795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw" (OuterVolumeSpecName: "kube-api-access-lkqxw") pod "d856fcef-f614-4291-818c-3f77282f5940" (UID: "d856fcef-f614-4291-818c-3f77282f5940"). InnerVolumeSpecName "kube-api-access-lkqxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.843864 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data" (OuterVolumeSpecName: "config-data") pod "d856fcef-f614-4291-818c-3f77282f5940" (UID: "d856fcef-f614-4291-818c-3f77282f5940"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:53 crc kubenswrapper[4763]: E1006 16:26:53.854913 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:26:53 crc kubenswrapper[4763]: E1006 16:26:53.856494 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:26:53 crc kubenswrapper[4763]: E1006 16:26:53.857843 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 16:26:53 crc kubenswrapper[4763]: E1006 16:26:53.857885 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerName="nova-scheduler-scheduler" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.872334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d856fcef-f614-4291-818c-3f77282f5940" (UID: "d856fcef-f614-4291-818c-3f77282f5940"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.907221 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.907252 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d856fcef-f614-4291-818c-3f77282f5940-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:53 crc kubenswrapper[4763]: I1006 16:26:53.907265 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkqxw\" (UniqueName: \"kubernetes.io/projected/d856fcef-f614-4291-818c-3f77282f5940-kube-api-access-lkqxw\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.283135 4763 generic.go:334] "Generic (PLEG): container finished" podID="d856fcef-f614-4291-818c-3f77282f5940" containerID="efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e" exitCode=0 Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.283176 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d856fcef-f614-4291-818c-3f77282f5940","Type":"ContainerDied","Data":"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e"} Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.283202 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d856fcef-f614-4291-818c-3f77282f5940","Type":"ContainerDied","Data":"48587decd952c43bf3bc1bdd1430087e2947358079a04e0bf158498b0b0a5015"} Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.283219 4763 scope.go:117] "RemoveContainer" containerID="efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.283321 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.337542 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.344865 4763 scope.go:117] "RemoveContainer" containerID="efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e" Oct 06 16:26:54 crc kubenswrapper[4763]: E1006 16:26:54.345833 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e\": container with ID starting with efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e not found: ID does not exist" containerID="efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.345877 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e"} err="failed to get container status \"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e\": rpc error: code = NotFound desc = could not find container \"efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e\": container with ID starting with efb01d5374fa860e235086bac80cb3ed8e55a0519d755e13c6375c344715762e not found: ID does not exist" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.362327 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.376765 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:54 crc kubenswrapper[4763]: E1006 16:26:54.377312 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="dnsmasq-dns" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.377339 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="dnsmasq-dns" Oct 06 16:26:54 crc kubenswrapper[4763]: E1006 16:26:54.377368 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d856fcef-f614-4291-818c-3f77282f5940" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.377377 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d856fcef-f614-4291-818c-3f77282f5940" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 16:26:54 crc kubenswrapper[4763]: E1006 16:26:54.377394 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="init" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.377402 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="init" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.377643 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d856fcef-f614-4291-818c-3f77282f5940" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.377673 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c692a9-f907-437b-a5cf-51e48a5450e2" containerName="dnsmasq-dns" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.378458 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.382683 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.384629 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.517674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.517742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhnw6\" (UniqueName: \"kubernetes.io/projected/21813bff-a137-4a98-b0fd-f5e639425cba-kube-api-access-xhnw6\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.517922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.619243 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.619304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhnw6\" (UniqueName: \"kubernetes.io/projected/21813bff-a137-4a98-b0fd-f5e639425cba-kube-api-access-xhnw6\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.619357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.633409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.639175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21813bff-a137-4a98-b0fd-f5e639425cba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.643126 
4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhnw6\" (UniqueName: \"kubernetes.io/projected/21813bff-a137-4a98-b0fd-f5e639425cba-kube-api-access-xhnw6\") pod \"nova-cell1-novncproxy-0\" (UID: \"21813bff-a137-4a98-b0fd-f5e639425cba\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:54 crc kubenswrapper[4763]: I1006 16:26:54.703278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.205070 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.212336 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.295791 4763 generic.go:334] "Generic (PLEG): container finished" podID="9ed10075-b98c-4856-9515-d2ee60040058" containerID="09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c" exitCode=0 Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.295847 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.296068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ed10075-b98c-4856-9515-d2ee60040058","Type":"ContainerDied","Data":"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c"} Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.296283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ed10075-b98c-4856-9515-d2ee60040058","Type":"ContainerDied","Data":"c27924aa5eb68d57e7dbf19fca4b7f6032f251339dbbed5969739667e94fb254"} Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.296394 4763 scope.go:117] "RemoveContainer" containerID="09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.301937 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21813bff-a137-4a98-b0fd-f5e639425cba","Type":"ContainerStarted","Data":"263260dc3bebcf4a77f9a392f0b4e8c4ff0d8c727952cd551e06290375fe86d1"} Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.332702 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwn9m\" (UniqueName: \"kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m\") pod \"9ed10075-b98c-4856-9515-d2ee60040058\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.333149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle\") pod \"9ed10075-b98c-4856-9515-d2ee60040058\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.333415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data\") pod \"9ed10075-b98c-4856-9515-d2ee60040058\" (UID: \"9ed10075-b98c-4856-9515-d2ee60040058\") " Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.337077 4763 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m" (OuterVolumeSpecName: "kube-api-access-nwn9m") pod "9ed10075-b98c-4856-9515-d2ee60040058" (UID: "9ed10075-b98c-4856-9515-d2ee60040058"). InnerVolumeSpecName "kube-api-access-nwn9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.358469 4763 scope.go:117] "RemoveContainer" containerID="09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c" Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.359064 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c\": container with ID starting with 09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c not found: ID does not exist" containerID="09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.359088 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c"} err="failed to get container status \"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c\": rpc error: code = NotFound desc = could not find container \"09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c\": container with ID starting with 09ca1b131c24717584bf0022ac8f4a1744b7a4130775eeff58f023e460b8220c not found: ID does not exist" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.363723 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data" (OuterVolumeSpecName: "config-data") pod "9ed10075-b98c-4856-9515-d2ee60040058" (UID: "9ed10075-b98c-4856-9515-d2ee60040058"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.368089 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ed10075-b98c-4856-9515-d2ee60040058" (UID: "9ed10075-b98c-4856-9515-d2ee60040058"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.435557 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.435603 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwn9m\" (UniqueName: \"kubernetes.io/projected/9ed10075-b98c-4856-9515-d2ee60040058-kube-api-access-nwn9m\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.435631 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed10075-b98c-4856-9515-d2ee60040058-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.534217 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.536133 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.537270 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.537311 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerName="nova-cell1-conductor-conductor" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.590436 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d856fcef-f614-4291-818c-3f77282f5940" path="/var/lib/kubelet/pods/d856fcef-f614-4291-818c-3f77282f5940/volumes" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.632669 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.642319 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.656533 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:55 crc kubenswrapper[4763]: E1006 16:26:55.656942 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed10075-b98c-4856-9515-d2ee60040058" containerName="nova-cell0-conductor-conductor" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.656963 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed10075-b98c-4856-9515-d2ee60040058" 
containerName="nova-cell0-conductor-conductor" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.657185 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed10075-b98c-4856-9515-d2ee60040058" containerName="nova-cell0-conductor-conductor" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.657836 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.662390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.671979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.841871 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.842187 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.842388 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hnd\" (UniqueName: \"kubernetes.io/projected/c1fed64c-1c98-4def-9138-53394c8a7181-kube-api-access-r9hnd\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.944480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.944890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.945078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hnd\" (UniqueName: \"kubernetes.io/projected/c1fed64c-1c98-4def-9138-53394c8a7181-kube-api-access-r9hnd\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.948937 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.950380 
4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fed64c-1c98-4def-9138-53394c8a7181-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.965872 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:56938->10.217.1.71:8775: read: connection reset by peer" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.965932 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:56930->10.217.1.71:8775: read: connection reset by peer" Oct 06 16:26:55 crc kubenswrapper[4763]: I1006 16:26:55.973860 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hnd\" (UniqueName: \"kubernetes.io/projected/c1fed64c-1c98-4def-9138-53394c8a7181-kube-api-access-r9hnd\") pod \"nova-cell0-conductor-0\" (UID: \"c1fed64c-1c98-4def-9138-53394c8a7181\") " pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.273405 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.325920 4763 generic.go:334] "Generic (PLEG): container finished" podID="6856d20c-96f8-415b-a86a-1231966beb28" containerID="7303d1a0cfb0b2bf8c33f9046e072c23106bf7cea6c6b2d15cee798002c214c1" exitCode=0 Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.325992 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerDied","Data":"7303d1a0cfb0b2bf8c33f9046e072c23106bf7cea6c6b2d15cee798002c214c1"} Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.332257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21813bff-a137-4a98-b0fd-f5e639425cba","Type":"ContainerStarted","Data":"4b64dbc58af865647d5f95cd3f507c26373bb33bdda84cbffade47c06974d328"} Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.351724 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.35170564 podStartE2EDuration="2.35170564s" podCreationTimestamp="2025-10-06 16:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:56.350862508 +0000 UTC m=+5613.506155030" watchObservedRunningTime="2025-10-06 16:26:56.35170564 +0000 UTC m=+5613.506998152" Oct 06 16:26:56 crc kubenswrapper[4763]: E1006 16:26:56.451564 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36320b5b_5ae6_4e50_af8e_b8dca958b12e.slice/crio-a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525.scope\": RecentStats: unable to find data in memory cache]" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.508042 4763 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.660769 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data\") pod \"6856d20c-96f8-415b-a86a-1231966beb28\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.661029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdjwr\" (UniqueName: \"kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr\") pod \"6856d20c-96f8-415b-a86a-1231966beb28\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.661176 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle\") pod \"6856d20c-96f8-415b-a86a-1231966beb28\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.661268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs\") pod \"6856d20c-96f8-415b-a86a-1231966beb28\" (UID: \"6856d20c-96f8-415b-a86a-1231966beb28\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.663318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs" (OuterVolumeSpecName: "logs") pod "6856d20c-96f8-415b-a86a-1231966beb28" (UID: "6856d20c-96f8-415b-a86a-1231966beb28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.684813 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr" (OuterVolumeSpecName: "kube-api-access-rdjwr") pod "6856d20c-96f8-415b-a86a-1231966beb28" (UID: "6856d20c-96f8-415b-a86a-1231966beb28"). InnerVolumeSpecName "kube-api-access-rdjwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.706887 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6856d20c-96f8-415b-a86a-1231966beb28" (UID: "6856d20c-96f8-415b-a86a-1231966beb28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.732123 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data" (OuterVolumeSpecName: "config-data") pod "6856d20c-96f8-415b-a86a-1231966beb28" (UID: "6856d20c-96f8-415b-a86a-1231966beb28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.765047 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.765188 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdjwr\" (UniqueName: \"kubernetes.io/projected/6856d20c-96f8-415b-a86a-1231966beb28-kube-api-access-rdjwr\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.765247 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6856d20c-96f8-415b-a86a-1231966beb28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.765299 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6856d20c-96f8-415b-a86a-1231966beb28-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.808836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.867344 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 16:26:56 crc kubenswrapper[4763]: W1006 16:26:56.881479 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1fed64c_1c98_4def_9138_53394c8a7181.slice/crio-bf960113083c217d63387ad113655f6b7d0b34812e499088a23db2c80284d8f5 WatchSource:0}: Error finding container bf960113083c217d63387ad113655f6b7d0b34812e499088a23db2c80284d8f5: Status 404 returned error can't find the container with id bf960113083c217d63387ad113655f6b7d0b34812e499088a23db2c80284d8f5 Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.968539 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data\") pod \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.968589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs\") pod \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.968682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twncb\" (UniqueName: \"kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb\") pod \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.968813 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle\") pod \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\" (UID: \"36320b5b-5ae6-4e50-af8e-b8dca958b12e\") " Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.969429 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs" (OuterVolumeSpecName: "logs") pod "36320b5b-5ae6-4e50-af8e-b8dca958b12e" (UID: "36320b5b-5ae6-4e50-af8e-b8dca958b12e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.969527 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36320b5b-5ae6-4e50-af8e-b8dca958b12e-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:56 crc kubenswrapper[4763]: I1006 16:26:56.972908 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb" (OuterVolumeSpecName: "kube-api-access-twncb") pod "36320b5b-5ae6-4e50-af8e-b8dca958b12e" (UID: "36320b5b-5ae6-4e50-af8e-b8dca958b12e"). InnerVolumeSpecName "kube-api-access-twncb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.009432 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36320b5b-5ae6-4e50-af8e-b8dca958b12e" (UID: "36320b5b-5ae6-4e50-af8e-b8dca958b12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.023067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data" (OuterVolumeSpecName: "config-data") pod "36320b5b-5ae6-4e50-af8e-b8dca958b12e" (UID: "36320b5b-5ae6-4e50-af8e-b8dca958b12e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.070961 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.070999 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36320b5b-5ae6-4e50-af8e-b8dca958b12e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.071009 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twncb\" (UniqueName: \"kubernetes.io/projected/36320b5b-5ae6-4e50-af8e-b8dca958b12e-kube-api-access-twncb\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.345886 4763 generic.go:334] "Generic (PLEG): container finished" podID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerID="a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525" exitCode=0 Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.345980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerDied","Data":"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525"} Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.346187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36320b5b-5ae6-4e50-af8e-b8dca958b12e","Type":"ContainerDied","Data":"8b507e8c726fe686527993690ff8d592fc214d0d12501e5eb50a0f365a099684"} Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.346205 4763 scope.go:117] "RemoveContainer" containerID="a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.346482 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.351475 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.351513 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6856d20c-96f8-415b-a86a-1231966beb28","Type":"ContainerDied","Data":"2ea31026be0a9252493b9e4cb07d17781bc236a3f3620ff91fe9a338ac6383e7"} Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.365122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c1fed64c-1c98-4def-9138-53394c8a7181","Type":"ContainerStarted","Data":"baa01cfccb0a4be6498fd0d9ceefb3452eba27772f44582614fa0c80ffef94ee"} Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.365190 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.365222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c1fed64c-1c98-4def-9138-53394c8a7181","Type":"ContainerStarted","Data":"bf960113083c217d63387ad113655f6b7d0b34812e499088a23db2c80284d8f5"} Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.369717 4763 scope.go:117] "RemoveContainer" containerID="7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.400593 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.400563482 podStartE2EDuration="2.400563482s" podCreationTimestamp="2025-10-06 16:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:57.3833233 +0000 UTC m=+5614.538615812" watchObservedRunningTime="2025-10-06 16:26:57.400563482 +0000 UTC m=+5614.555855994" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.428298 4763 scope.go:117] "RemoveContainer" containerID="a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525" Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.429574 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525\": container with ID starting with a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525 not found: ID does not exist" containerID="a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.429643 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525"} err="failed to get container status \"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525\": rpc error: code = NotFound desc = could not find container \"a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525\": container with ID starting with a8e2e5b4fba4480a57bdea4dc9cae9536143d7c534f9f363bf7fba8c18518525 not found: ID does not exist" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.429669 4763 scope.go:117] "RemoveContainer" containerID="7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767" Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.430974 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767\": container with ID starting with 7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767 not found: ID does not exist" containerID="7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.431003 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767"} err="failed to get container status \"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767\": rpc error: code = NotFound desc = could not find container \"7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767\": container with ID starting with 7fa0926fb0ae1c0324bc1f54bd30474ab168705e61cc525418b0345ae07d7767 not found: ID does not exist" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.431020 4763 scope.go:117] "RemoveContainer" containerID="7303d1a0cfb0b2bf8c33f9046e072c23106bf7cea6c6b2d15cee798002c214c1" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.440211 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.466521 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.474557 4763 scope.go:117] "RemoveContainer" containerID="077e2a22de54d1eb090b9182c659466749abb5d721bfedc293afdab7753a7ef5" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.477379 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.497748 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.498176 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-log" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-log" Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.498220 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498227 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.498239 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-api" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498246 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-api" Oct 06 16:26:57 crc kubenswrapper[4763]: E1006 16:26:57.498257 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498263 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498455 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-api" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498470 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-log" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498484 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" containerName="nova-api-log" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.498494 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6856d20c-96f8-415b-a86a-1231966beb28" containerName="nova-metadata-metadata" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.499482 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.502652 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.516768 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.525019 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.535671 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.537631 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.541372 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.542443 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.579644 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00106a0-435a-4e68-854d-a810d4113012-logs\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.579726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-config-data\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.579821 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.579942 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5xj4\" (UniqueName: \"kubernetes.io/projected/a00106a0-435a-4e68-854d-a810d4113012-kube-api-access-k5xj4\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.608535 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="36320b5b-5ae6-4e50-af8e-b8dca958b12e" path="/var/lib/kubelet/pods/36320b5b-5ae6-4e50-af8e-b8dca958b12e/volumes" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.610677 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6856d20c-96f8-415b-a86a-1231966beb28" path="/var/lib/kubelet/pods/6856d20c-96f8-415b-a86a-1231966beb28/volumes" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.611818 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed10075-b98c-4856-9515-d2ee60040058" path="/var/lib/kubelet/pods/9ed10075-b98c-4856-9515-d2ee60040058/volumes" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.682142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5xj4\" (UniqueName: \"kubernetes.io/projected/a00106a0-435a-4e68-854d-a810d4113012-kube-api-access-k5xj4\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.682255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-logs\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.682283 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00106a0-435a-4e68-854d-a810d4113012-logs\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.682302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46vp\" (UniqueName: \"kubernetes.io/projected/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-kube-api-access-v46vp\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.684130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-config-data\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.684197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.684234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.684307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-config-data\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" 
Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.684349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00106a0-435a-4e68-854d-a810d4113012-logs\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.690197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-config-data\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.692334 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00106a0-435a-4e68-854d-a810d4113012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.711957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5xj4\" (UniqueName: \"kubernetes.io/projected/a00106a0-435a-4e68-854d-a810d4113012-kube-api-access-k5xj4\") pod \"nova-api-0\" (UID: \"a00106a0-435a-4e68-854d-a810d4113012\") " pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.785752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.785826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-config-data\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.785940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-logs\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.785977 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46vp\" (UniqueName: \"kubernetes.io/projected/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-kube-api-access-v46vp\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.787988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-logs\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.793889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-config-data\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 
16:26:57.801669 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.810218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46vp\" (UniqueName: \"kubernetes.io/projected/054b7d7c-25d6-4c49-bda3-b52a32ea12a0-kube-api-access-v46vp\") pod \"nova-metadata-0\" (UID: \"054b7d7c-25d6-4c49-bda3-b52a32ea12a0\") " pod="openstack/nova-metadata-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.831448 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 16:26:57 crc kubenswrapper[4763]: I1006 16:26:57.879712 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.296431 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.383517 4763 generic.go:334] "Generic (PLEG): container finished" podID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" exitCode=0 Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.383564 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.383580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87b7f703-6648-461e-8edb-63d1351c3cbb","Type":"ContainerDied","Data":"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e"} Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.384006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87b7f703-6648-461e-8edb-63d1351c3cbb","Type":"ContainerDied","Data":"f5b0221461d6a869ccbfbacabba2b9212add8e45593c7bf492f3c6fe56d9aade"} Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.384023 4763 scope.go:117] "RemoveContainer" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.394750 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: W1006 16:26:58.400459 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00106a0_435a_4e68_854d_a810d4113012.slice/crio-8ce4f4c14bd7e517c30bcf6ec2673e5f0b93753fc38384b11109bf38f3ce1a20 WatchSource:0}: Error finding container 8ce4f4c14bd7e517c30bcf6ec2673e5f0b93753fc38384b11109bf38f3ce1a20: Status 404 returned error can't find the container with id 8ce4f4c14bd7e517c30bcf6ec2673e5f0b93753fc38384b11109bf38f3ce1a20 Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.401036 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data\") pod \"87b7f703-6648-461e-8edb-63d1351c3cbb\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.401122 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zvg7x\" (UniqueName: \"kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x\") pod \"87b7f703-6648-461e-8edb-63d1351c3cbb\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.401299 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle\") pod \"87b7f703-6648-461e-8edb-63d1351c3cbb\" (UID: \"87b7f703-6648-461e-8edb-63d1351c3cbb\") " Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.406819 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x" (OuterVolumeSpecName: "kube-api-access-zvg7x") pod "87b7f703-6648-461e-8edb-63d1351c3cbb" (UID: "87b7f703-6648-461e-8edb-63d1351c3cbb"). InnerVolumeSpecName "kube-api-access-zvg7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.410292 4763 scope.go:117] "RemoveContainer" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" Oct 06 16:26:58 crc kubenswrapper[4763]: E1006 16:26:58.411062 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e\": container with ID starting with b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e not found: ID does not exist" containerID="b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.411121 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e"} err="failed to get container status \"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e\": rpc error: code = NotFound desc = could not find container \"b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e\": container with ID starting with b171c9fbe5f842bf1510d3de5d810bef8189d7b149475e966724ddaccccc059e not found: ID does not exist" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.426368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87b7f703-6648-461e-8edb-63d1351c3cbb" (UID: "87b7f703-6648-461e-8edb-63d1351c3cbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.429442 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data" (OuterVolumeSpecName: "config-data") pod "87b7f703-6648-461e-8edb-63d1351c3cbb" (UID: "87b7f703-6648-461e-8edb-63d1351c3cbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.466098 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.504085 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvg7x\" (UniqueName: \"kubernetes.io/projected/87b7f703-6648-461e-8edb-63d1351c3cbb-kube-api-access-zvg7x\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.504116 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.504126 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b7f703-6648-461e-8edb-63d1351c3cbb-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.722538 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.731686 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.743198 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: E1006 16:26:58.743668 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerName="nova-scheduler-scheduler" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.743691 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerName="nova-scheduler-scheduler" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.744960 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" containerName="nova-scheduler-scheduler" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.745707 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.749434 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.764786 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.910331 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdzmb\" (UniqueName: \"kubernetes.io/projected/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-kube-api-access-bdzmb\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.910712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-config-data\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:58 crc kubenswrapper[4763]: I1006 16:26:58.910789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.012811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdzmb\" (UniqueName: \"kubernetes.io/projected/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-kube-api-access-bdzmb\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.012874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-config-data\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.012929 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.017898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.018490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-config-data\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.031040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdzmb\" (UniqueName: 
\"kubernetes.io/projected/3b4e3344-ba03-4156-9ce3-6b6c2f524c5f-kube-api-access-bdzmb\") pod \"nova-scheduler-0\" (UID: \"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f\") " pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.077276 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.400117 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"054b7d7c-25d6-4c49-bda3-b52a32ea12a0","Type":"ContainerStarted","Data":"dacfdcf6efc9c6873ed4d7881c45461b58bc2e19df57332bd0ffaf184e9b62e5"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.400540 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"054b7d7c-25d6-4c49-bda3-b52a32ea12a0","Type":"ContainerStarted","Data":"458fb5d8ba3e58db9859d5ddb7133b6706ee5bb90fdfe0652d4a389533818f7e"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.400569 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"054b7d7c-25d6-4c49-bda3-b52a32ea12a0","Type":"ContainerStarted","Data":"a6a6ec5a865c32ba399180a1061c30d94812d2da20291cb0890957f695707446"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.402049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a00106a0-435a-4e68-854d-a810d4113012","Type":"ContainerStarted","Data":"89087ec0272c26a7239596a17e1269916597d750b0419a8977086c1e104a3adf"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.402095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a00106a0-435a-4e68-854d-a810d4113012","Type":"ContainerStarted","Data":"94ee22111e7fe4f2301f1946796b587e8df5260c359a2758dd2ade0ac0fd5ecc"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.402108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a00106a0-435a-4e68-854d-a810d4113012","Type":"ContainerStarted","Data":"8ce4f4c14bd7e517c30bcf6ec2673e5f0b93753fc38384b11109bf38f3ce1a20"} Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.421585 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.42156547 podStartE2EDuration="2.42156547s" podCreationTimestamp="2025-10-06 16:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:59.420073 +0000 UTC m=+5616.575365512" watchObservedRunningTime="2025-10-06 16:26:59.42156547 +0000 UTC m=+5616.576857982" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.443660 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.443642541 podStartE2EDuration="2.443642541s" podCreationTimestamp="2025-10-06 16:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:26:59.438714229 +0000 UTC m=+5616.594006761" watchObservedRunningTime="2025-10-06 16:26:59.443642541 +0000 UTC m=+5616.598935053" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.557558 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 16:26:59 crc kubenswrapper[4763]: W1006 16:26:59.561201 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b4e3344_ba03_4156_9ce3_6b6c2f524c5f.slice/crio-f167c139caaca4618c44abbefd833b2c0cfa1fd88bd8322d20b6ef1058b3934a WatchSource:0}: Error finding container f167c139caaca4618c44abbefd833b2c0cfa1fd88bd8322d20b6ef1058b3934a: Status 404 returned error can't find the container with id f167c139caaca4618c44abbefd833b2c0cfa1fd88bd8322d20b6ef1058b3934a Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.602207 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b7f703-6648-461e-8edb-63d1351c3cbb" path="/var/lib/kubelet/pods/87b7f703-6648-461e-8edb-63d1351c3cbb/volumes" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.703834 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.763739 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.844401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhv7\" (UniqueName: \"kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7\") pod \"f85f902d-6b8b-4847-8581-b6a42fcc875e\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.844475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle\") pod \"f85f902d-6b8b-4847-8581-b6a42fcc875e\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.844640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-config-data\") pod \"f85f902d-6b8b-4847-8581-b6a42fcc875e\" (UID: \"f85f902d-6b8b-4847-8581-b6a42fcc875e\") " Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.850729 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7" (OuterVolumeSpecName: "kube-api-access-dfhv7") pod "f85f902d-6b8b-4847-8581-b6a42fcc875e" (UID: "f85f902d-6b8b-4847-8581-b6a42fcc875e"). InnerVolumeSpecName "kube-api-access-dfhv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.869578 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85f902d-6b8b-4847-8581-b6a42fcc875e" (UID: "f85f902d-6b8b-4847-8581-b6a42fcc875e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.876119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-config-data" (OuterVolumeSpecName: "config-data") pod "f85f902d-6b8b-4847-8581-b6a42fcc875e" (UID: "f85f902d-6b8b-4847-8581-b6a42fcc875e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.951330 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhv7\" (UniqueName: \"kubernetes.io/projected/f85f902d-6b8b-4847-8581-b6a42fcc875e-kube-api-access-dfhv7\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.951398 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:26:59 crc kubenswrapper[4763]: I1006 16:26:59.951408 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85f902d-6b8b-4847-8581-b6a42fcc875e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.420273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f","Type":"ContainerStarted","Data":"17f9d1ee3cf312fceb40f249e7c46efa926224b58963d08648eb33cdf4e75542"} Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.420314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b4e3344-ba03-4156-9ce3-6b6c2f524c5f","Type":"ContainerStarted","Data":"f167c139caaca4618c44abbefd833b2c0cfa1fd88bd8322d20b6ef1058b3934a"} Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.422239 4763 generic.go:334] "Generic (PLEG): container finished" podID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" exitCode=0 Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.422357 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.423288 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f85f902d-6b8b-4847-8581-b6a42fcc875e","Type":"ContainerDied","Data":"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa"} Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.423340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f85f902d-6b8b-4847-8581-b6a42fcc875e","Type":"ContainerDied","Data":"b05f6ad18b988fe29e1fed5823d12c7ad57953663034760528c1af04b3ef167f"} Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.423371 4763 scope.go:117] "RemoveContainer" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.450545 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.450519429 podStartE2EDuration="2.450519429s" podCreationTimestamp="2025-10-06 16:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:27:00.450009135 +0000 UTC m=+5617.605301677" watchObservedRunningTime="2025-10-06 16:27:00.450519429 +0000 UTC m=+5617.605811981" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.467696 4763 scope.go:117] "RemoveContainer" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" Oct 06 16:27:00 crc kubenswrapper[4763]: E1006 16:27:00.468844 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa\": container with ID starting with 16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa not found: ID does not exist" containerID="16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.468882 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa"} err="failed to get container status \"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa\": rpc error: code = NotFound desc = could not find container \"16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa\": container with ID starting with 16a28c2d8ab8a1aba88a8e36351079ca2b43120ab3b71aba1e634ed7f66d1ffa not found: ID does not exist" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.495928 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.508293 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.530526 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:27:00 crc kubenswrapper[4763]: E1006 16:27:00.531020 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerName="nova-cell1-conductor-conductor" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.531045 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerName="nova-cell1-conductor-conductor" Oct 06 
16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.531318 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" containerName="nova-cell1-conductor-conductor" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.532197 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.534434 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.548829 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.673489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v79s\" (UniqueName: \"kubernetes.io/projected/c8027b86-62d5-4eea-bbd4-8af38109f89f-kube-api-access-9v79s\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.674428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.674958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.776335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v79s\" (UniqueName: \"kubernetes.io/projected/c8027b86-62d5-4eea-bbd4-8af38109f89f-kube-api-access-9v79s\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.776440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.776540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.790606 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.791293 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8027b86-62d5-4eea-bbd4-8af38109f89f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.794965 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v79s\" (UniqueName: \"kubernetes.io/projected/c8027b86-62d5-4eea-bbd4-8af38109f89f-kube-api-access-9v79s\") pod \"nova-cell1-conductor-0\" (UID: \"c8027b86-62d5-4eea-bbd4-8af38109f89f\") " pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:00 crc kubenswrapper[4763]: I1006 16:27:00.853824 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:01 crc kubenswrapper[4763]: I1006 16:27:01.342878 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 16:27:01 crc kubenswrapper[4763]: I1006 16:27:01.438324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8027b86-62d5-4eea-bbd4-8af38109f89f","Type":"ContainerStarted","Data":"00de7dad3d5bd8c64862bbddf0cf53659c40b1e7c5c5bd852b15870f4946a251"} Oct 06 16:27:01 crc kubenswrapper[4763]: I1006 16:27:01.592513 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85f902d-6b8b-4847-8581-b6a42fcc875e" path="/var/lib/kubelet/pods/f85f902d-6b8b-4847-8581-b6a42fcc875e/volumes" Oct 06 16:27:02 crc kubenswrapper[4763]: I1006 16:27:02.450754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8027b86-62d5-4eea-bbd4-8af38109f89f","Type":"ContainerStarted","Data":"825b849d4f6dde95a8f0cb1b22532025d2c74b2fdb1ce4dfa026258701d8c8ab"} Oct 06 16:27:02 crc kubenswrapper[4763]: I1006 16:27:02.451393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:02 crc kubenswrapper[4763]: I1006 16:27:02.474352 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.4743330820000002 podStartE2EDuration="2.474333082s" podCreationTimestamp="2025-10-06 16:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:27:02.469547684 +0000 UTC m=+5619.624840196" watchObservedRunningTime="2025-10-06 16:27:02.474333082 +0000 UTC m=+5619.629625594" Oct 06 16:27:02 crc kubenswrapper[4763]: I1006 16:27:02.880278 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:27:02 crc kubenswrapper[4763]: I1006 16:27:02.880675 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 16:27:04 crc kubenswrapper[4763]: I1006 16:27:04.079050 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 16:27:04 crc kubenswrapper[4763]: I1006 16:27:04.704111 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:27:04 crc kubenswrapper[4763]: I1006 16:27:04.726706 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:27:05 crc kubenswrapper[4763]: I1006 16:27:05.503683 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 16:27:06 crc kubenswrapper[4763]: I1006 16:27:06.298815 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 16:27:07 crc kubenswrapper[4763]: I1006 16:27:07.832563 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:27:07 crc kubenswrapper[4763]: I1006 16:27:07.832816 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 16:27:07 crc kubenswrapper[4763]: I1006 16:27:07.880897 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:27:07 crc kubenswrapper[4763]: I1006 16:27:07.881060 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 16:27:08 crc kubenswrapper[4763]: I1006 16:27:08.915829 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a00106a0-435a-4e68-854d-a810d4113012" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:27:08 crc kubenswrapper[4763]: I1006 16:27:08.915865 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a00106a0-435a-4e68-854d-a810d4113012" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:27:09 crc kubenswrapper[4763]: I1006 16:27:09.002295 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="054b7d7c-25d6-4c49-bda3-b52a32ea12a0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:27:09 crc kubenswrapper[4763]: I1006 16:27:09.002304 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="054b7d7c-25d6-4c49-bda3-b52a32ea12a0" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 16:27:09 crc kubenswrapper[4763]: I1006 16:27:09.078746 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 16:27:09 crc kubenswrapper[4763]: I1006 16:27:09.104263 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 16:27:09 crc kubenswrapper[4763]: I1006 16:27:09.610959 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 16:27:10 crc kubenswrapper[4763]: I1006 16:27:10.911246 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.306024 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.307687 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.309515 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.334271 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335603 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswxh\" (UniqueName: \"kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335716 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335877 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.335969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437758 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswxh\" (UniqueName: \"kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.437937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.438051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.444889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.445598 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.446167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.446527 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.452865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bswxh\" (UniqueName: \"kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh\") pod \"cinder-scheduler-0\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") " 
pod="openstack/cinder-scheduler-0" Oct 06 16:27:12 crc kubenswrapper[4763]: I1006 16:27:12.629391 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 16:27:13 crc kubenswrapper[4763]: I1006 16:27:13.148526 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:13 crc kubenswrapper[4763]: I1006 16:27:13.573681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerStarted","Data":"da0eff046a37132938b0e9fab281fe174f8f8aaefe19f80826d6e7563aada0a4"} Oct 06 16:27:13 crc kubenswrapper[4763]: I1006 16:27:13.828453 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 16:27:13 crc kubenswrapper[4763]: I1006 16:27:13.835441 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api-log" containerID="cri-o://35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76" gracePeriod=30 Oct 06 16:27:13 crc kubenswrapper[4763]: I1006 16:27:13.835588 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api" containerID="cri-o://dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d" gracePeriod=30 Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.338688 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.340856 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.345366 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.352391 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475373 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-run\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475458 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475487 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475776 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.475957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.476031 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.476080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.476301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.476368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxll\" (UniqueName: \"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-kube-api-access-7gxll\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578530 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578608 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578748 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxll\" (UniqueName: \"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-kube-api-access-7gxll\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-run\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578908 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.578995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.579271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.579714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.579724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.579844 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.580046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-run\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.580853 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.580924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.583587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " 
pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.583820 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b0955f-bf72-48f7-86de-a50ce7701fc7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.585840 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.585936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.587039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.587826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.588207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b0955f-bf72-48f7-86de-a50ce7701fc7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.600011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerStarted","Data":"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f"} Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.600075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerStarted","Data":"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743"} Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.609468 4763 generic.go:334] "Generic (PLEG): container finished" podID="eeabf008-f00e-4394-a691-affd62647290" containerID="35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76" exitCode=143 Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.609498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerDied","Data":"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"} Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.629433 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxll\" (UniqueName: 
\"kubernetes.io/projected/91b0955f-bf72-48f7-86de-a50ce7701fc7-kube-api-access-7gxll\") pod \"cinder-volume-volume1-0\" (UID: \"91b0955f-bf72-48f7-86de-a50ce7701fc7\") " pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.634148 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.634132136 podStartE2EDuration="2.634132136s" podCreationTimestamp="2025-10-06 16:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:27:14.628329581 +0000 UTC m=+5631.783622203" watchObservedRunningTime="2025-10-06 16:27:14.634132136 +0000 UTC m=+5631.789424648" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.671211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.907923 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.910239 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.912748 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.924631 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989532 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-nvme\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-scripts\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989655 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989697 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-dev\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989844 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb2dg\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-kube-api-access-hb2dg\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989889 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989966 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-ceph\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.989995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data-custom\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.990034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-lib-modules\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.990140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-run\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:14 crc kubenswrapper[4763]: I1006 16:27:14.990189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-sys\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-scripts\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-dev\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb2dg\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-kube-api-access-hb2dg\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092514 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-ceph\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data-custom\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-lib-modules\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-run\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-sys\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-nvme\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.092820 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.093791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-run\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.093814 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-lib-modules\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.093919 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-sys\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.093954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.093978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-dev\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.094032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.094330 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.094445 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-etc-nvme\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.094524 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.094579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/090fd341-e9a5-4d12-8aae-271b6b421647-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.099415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-ceph\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.099485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.100331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.103934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-config-data-custom\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.103942 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090fd341-e9a5-4d12-8aae-271b6b421647-scripts\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.111240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb2dg\" (UniqueName: \"kubernetes.io/projected/090fd341-e9a5-4d12-8aae-271b6b421647-kube-api-access-hb2dg\") pod \"cinder-backup-0\" (UID: \"090fd341-e9a5-4d12-8aae-271b6b421647\") " pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.241772 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.261715 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 06 16:27:15 crc kubenswrapper[4763]: W1006 16:27:15.271098 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b0955f_bf72_48f7_86de_a50ce7701fc7.slice/crio-bf1e88e44e5e1c340791d04bf13147d31d77a46b45d49144be917fd0a9339cdb WatchSource:0}: Error finding container bf1e88e44e5e1c340791d04bf13147d31d77a46b45d49144be917fd0a9339cdb: Status 404 returned error can't find the container with id bf1e88e44e5e1c340791d04bf13147d31d77a46b45d49144be917fd0a9339cdb Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.276351 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.619285 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"91b0955f-bf72-48f7-86de-a50ce7701fc7","Type":"ContainerStarted","Data":"bf1e88e44e5e1c340791d04bf13147d31d77a46b45d49144be917fd0a9339cdb"} Oct 06 16:27:15 crc kubenswrapper[4763]: I1006 16:27:15.805764 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 06 16:27:16 crc kubenswrapper[4763]: I1006 16:27:16.656167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"91b0955f-bf72-48f7-86de-a50ce7701fc7","Type":"ContainerStarted","Data":"86877e7efb72c06e87ae7110887fa025abad1ac2c535579e38cae5be8a123a3f"} Oct 06 16:27:16 crc kubenswrapper[4763]: I1006 16:27:16.657184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"090fd341-e9a5-4d12-8aae-271b6b421647","Type":"ContainerStarted","Data":"378d2161c3754906408b75b3519179480935ac23e4ae14a71206e73596f5813e"} Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.003140 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="eeabf008-f00e-4394-a691-affd62647290" 
containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:47504->10.217.1.79:8776: read: connection reset by peer" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.324427 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461627 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mljvv\" (UniqueName: \"kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.461810 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id\") pod \"eeabf008-f00e-4394-a691-affd62647290\" (UID: \"eeabf008-f00e-4394-a691-affd62647290\") " Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.462215 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.462354 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs" (OuterVolumeSpecName: "logs") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.467917 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv" (OuterVolumeSpecName: "kube-api-access-mljvv") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "kube-api-access-mljvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.468032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts" (OuterVolumeSpecName: "scripts") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.471147 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.493989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.524164 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data" (OuterVolumeSpecName: "config-data") pod "eeabf008-f00e-4394-a691-affd62647290" (UID: "eeabf008-f00e-4394-a691-affd62647290"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563519 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563555 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563567 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mljvv\" (UniqueName: \"kubernetes.io/projected/eeabf008-f00e-4394-a691-affd62647290-kube-api-access-mljvv\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563581 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563591 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eeabf008-f00e-4394-a691-affd62647290-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563601 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeabf008-f00e-4394-a691-affd62647290-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.563629 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeabf008-f00e-4394-a691-affd62647290-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.630235 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.668748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"91b0955f-bf72-48f7-86de-a50ce7701fc7","Type":"ContainerStarted","Data":"76752802411cf47a260711774abba90e8bd9c0fa54b6326898df68fefcaca956"} Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.677514 4763 generic.go:334] "Generic (PLEG): container finished" podID="eeabf008-f00e-4394-a691-affd62647290" containerID="dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d" exitCode=0 Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.677654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerDied","Data":"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d"} Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.677689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eeabf008-f00e-4394-a691-affd62647290","Type":"ContainerDied","Data":"b1869bdf447703fc95201d8e47f662cd1839d798b79a09e08ba8f9e2d4795a84"} Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.677709 4763 scope.go:117] "RemoveContainer" containerID="dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d" Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.678146 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.680933 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"090fd341-e9a5-4d12-8aae-271b6b421647","Type":"ContainerStarted","Data":"420a3f21bd62242b09ee0e72fb3f11260ab7ea6ee6962fa78bceea65ba32d914"}
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.680988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"090fd341-e9a5-4d12-8aae-271b6b421647","Type":"ContainerStarted","Data":"5ec5f764a5f39a20a8030503dcfd4ff403075fafe96654237e5da70c61d6a330"}
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.701833 4763 scope.go:117] "RemoveContainer" containerID="35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.715883 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.90688713 podStartE2EDuration="3.715864034s" podCreationTimestamp="2025-10-06 16:27:14 +0000 UTC" firstStartedPulling="2025-10-06 16:27:15.276110657 +0000 UTC m=+5632.431403159" lastFinishedPulling="2025-10-06 16:27:16.085087551 +0000 UTC m=+5633.240380063" observedRunningTime="2025-10-06 16:27:17.70191211 +0000 UTC m=+5634.857204632" watchObservedRunningTime="2025-10-06 16:27:17.715864034 +0000 UTC m=+5634.871156566"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.717937 4763 scope.go:117] "RemoveContainer" containerID="dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d"
Oct 06 16:27:17 crc kubenswrapper[4763]: E1006 16:27:17.718324 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d\": container with ID starting with dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d not found: ID does not exist" containerID="dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.718372 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d"} err="failed to get container status \"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d\": rpc error: code = NotFound desc = could not find container \"dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d\": container with ID starting with dffe035c5ee82369a141b0a25aa2700694dfe72af123d97fad26948a1263cf2d not found: ID does not exist"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.718406 4763 scope.go:117] "RemoveContainer" containerID="35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"
Oct 06 16:27:17 crc kubenswrapper[4763]: E1006 16:27:17.718817 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76\": container with ID starting with 35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76 not found: ID does not exist" containerID="35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.718848 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76"} err="failed to get container status \"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76\": rpc error: code = NotFound desc = could not find container \"35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76\": container with ID starting with 35430aceb3e38e66a9582f68b7dec4401eb8cfaa275d1151a5189ea42e2b9a76 not found: ID does not exist"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.736779 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.848852464 podStartE2EDuration="3.736758894s" podCreationTimestamp="2025-10-06 16:27:14 +0000 UTC" firstStartedPulling="2025-10-06 16:27:15.812334243 +0000 UTC m=+5632.967626755" lastFinishedPulling="2025-10-06 16:27:16.700240683 +0000 UTC m=+5633.855533185" observedRunningTime="2025-10-06 16:27:17.733686102 +0000 UTC m=+5634.888978634" watchObservedRunningTime="2025-10-06 16:27:17.736758894 +0000 UTC m=+5634.892051406"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.753504 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.762811 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.787207 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 06 16:27:17 crc kubenswrapper[4763]: E1006 16:27:17.787809 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api-log"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.787836 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api-log"
Oct 06 16:27:17 crc kubenswrapper[4763]: E1006 16:27:17.787858 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.787867 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.788101 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.788131 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeabf008-f00e-4394-a691-affd62647290" containerName="cinder-api-log"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.789357 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.791412 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.817341 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.838360 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.838955 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.845104 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.845180 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.871808 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.871855 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.871882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jr5\" (UniqueName: \"kubernetes.io/projected/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-kube-api-access-d8jr5\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.871955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-logs\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.872069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.872109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-scripts\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.872134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data-custom\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.882264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.882698 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.884492 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.885916 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.973449 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-scripts\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data-custom\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974389 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jr5\" (UniqueName: \"kubernetes.io/projected/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-kube-api-access-d8jr5\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.974501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-logs\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.975442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-logs\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.975489 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.980937 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.985217 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-scripts\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.985600 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.988784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-config-data-custom\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:17 crc kubenswrapper[4763]: I1006 16:27:17.992069 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jr5\" (UniqueName: \"kubernetes.io/projected/03ffa1c1-5baf-4f5f-8120-a9b17af29abe-kube-api-access-d8jr5\") pod \"cinder-api-0\" (UID: \"03ffa1c1-5baf-4f5f-8120-a9b17af29abe\") " pod="openstack/cinder-api-0"
Oct 06 16:27:18 crc kubenswrapper[4763]: I1006 16:27:18.114444 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 16:27:18 crc kubenswrapper[4763]: I1006 16:27:18.588703 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 16:27:18 crc kubenswrapper[4763]: W1006 16:27:18.598199 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ffa1c1_5baf_4f5f_8120_a9b17af29abe.slice/crio-6573dc882c1711069ae3b3b6c072e4db0041d61ed157945e5e485b751f9174ba WatchSource:0}: Error finding container 6573dc882c1711069ae3b3b6c072e4db0041d61ed157945e5e485b751f9174ba: Status 404 returned error can't find the container with id 6573dc882c1711069ae3b3b6c072e4db0041d61ed157945e5e485b751f9174ba
Oct 06 16:27:18 crc kubenswrapper[4763]: I1006 16:27:18.698160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"03ffa1c1-5baf-4f5f-8120-a9b17af29abe","Type":"ContainerStarted","Data":"6573dc882c1711069ae3b3b6c072e4db0041d61ed157945e5e485b751f9174ba"}
Oct 06 16:27:18 crc kubenswrapper[4763]: I1006 16:27:18.701269 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 16:27:18 crc kubenswrapper[4763]: I1006 16:27:18.704788 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 16:27:19 crc kubenswrapper[4763]: I1006 16:27:19.585106 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeabf008-f00e-4394-a691-affd62647290" path="/var/lib/kubelet/pods/eeabf008-f00e-4394-a691-affd62647290/volumes"
Oct 06 16:27:19 crc kubenswrapper[4763]: I1006 16:27:19.675706 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0"
Oct 06 16:27:19 crc kubenswrapper[4763]: I1006 16:27:19.712179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"03ffa1c1-5baf-4f5f-8120-a9b17af29abe","Type":"ContainerStarted","Data":"84fbf3dc370bf5a488c5dca1e6f3c5d9b28b23bb84a087d24f9deaddd70e2a46"}
Oct 06 16:27:20 crc kubenswrapper[4763]: I1006 16:27:20.242576 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Oct 06 16:27:20 crc kubenswrapper[4763]: I1006 16:27:20.730274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"03ffa1c1-5baf-4f5f-8120-a9b17af29abe","Type":"ContainerStarted","Data":"6045a9280a216dada691bfaf0849db82727d2a4da385bed51f5adbf3def9f5b9"}
Oct 06 16:27:20 crc kubenswrapper[4763]: I1006 16:27:20.762309 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.762278096 podStartE2EDuration="3.762278096s" podCreationTimestamp="2025-10-06 16:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:27:20.752676308 +0000 UTC m=+5637.907968850" watchObservedRunningTime="2025-10-06 16:27:20.762278096 +0000 UTC m=+5637.917570648"
Oct 06 16:27:21 crc kubenswrapper[4763]: I1006 16:27:21.739970 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 06 16:27:22 crc kubenswrapper[4763]: I1006 16:27:22.811043 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 06 16:27:22 crc kubenswrapper[4763]: I1006 16:27:22.899134 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.762222 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="cinder-scheduler" containerID="cri-o://42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743" gracePeriod=30
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.762418 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="probe" containerID="cri-o://67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f" gracePeriod=30
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.838754 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"]
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.840786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.847538 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"]
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.996881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7b8\" (UniqueName: \"kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.996935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:23 crc kubenswrapper[4763]: I1006 16:27:23.997078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.098683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7b8\" (UniqueName: \"kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.098737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.098778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.099294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.099801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.123231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7b8\" (UniqueName: \"kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8\") pod \"certified-operators-r8qgj\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.165193 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.555007 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"]
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.777304 4763 generic.go:334] "Generic (PLEG): container finished" podID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerID="67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f" exitCode=0
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.777404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerDied","Data":"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f"}
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.783004 4763 generic.go:334] "Generic (PLEG): container finished" podID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerID="14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9" exitCode=0
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.783045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerDied","Data":"14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9"}
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.783075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerStarted","Data":"6f8bbb235eda64fb4856694f3d755e20eaf94daa8e58d393992068205f7725e3"}
Oct 06 16:27:24 crc kubenswrapper[4763]: I1006 16:27:24.914816 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.293189 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438536 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438577 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438608 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswxh\" (UniqueName: \"kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.438686 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts\") pod \"e163584a-58e6-44c2-a5d2-a977558f6eff\" (UID: \"e163584a-58e6-44c2-a5d2-a977558f6eff\") "
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.439725 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.446734 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.452338 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh" (OuterVolumeSpecName: "kube-api-access-bswxh") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "kube-api-access-bswxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.452509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts" (OuterVolumeSpecName: "scripts") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.503578 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.513722 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.541485 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e163584a-58e6-44c2-a5d2-a977558f6eff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.541516 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.541526 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.541535 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswxh\" (UniqueName: \"kubernetes.io/projected/e163584a-58e6-44c2-a5d2-a977558f6eff-kube-api-access-bswxh\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.541546 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.584147 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data" (OuterVolumeSpecName: "config-data") pod "e163584a-58e6-44c2-a5d2-a977558f6eff" (UID: "e163584a-58e6-44c2-a5d2-a977558f6eff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.643924 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e163584a-58e6-44c2-a5d2-a977558f6eff-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.791735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerStarted","Data":"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d"} Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.794629 4763 generic.go:334] "Generic (PLEG): container finished" podID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerID="42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743" exitCode=0 Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.794658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerDied","Data":"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743"} Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.794675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e163584a-58e6-44c2-a5d2-a977558f6eff","Type":"ContainerDied","Data":"da0eff046a37132938b0e9fab281fe174f8f8aaefe19f80826d6e7563aada0a4"} Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.794690 4763 scope.go:117] "RemoveContainer" containerID="67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.794780 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.820399 4763 scope.go:117] "RemoveContainer" containerID="42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.837737 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.845558 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.859369 4763 scope.go:117] "RemoveContainer" containerID="67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.861516 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:25 crc kubenswrapper[4763]: E1006 16:27:25.861744 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f\": container with ID starting with 67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f not found: ID does not exist" containerID="67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.861880 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f"} err="failed to get container status \"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f\": rpc error: code = NotFound desc = could not find container \"67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f\": container with ID starting with 67f01ac3dcd2cfdeec697b62625025e297d68dbb5a0366fb05006503a410af0f not found: ID does not exist" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.862024 4763 scope.go:117] "RemoveContainer" containerID="42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743" Oct 06 16:27:25 crc kubenswrapper[4763]: E1006 16:27:25.862290 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="probe" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.866784 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="probe" Oct 06 16:27:25 crc kubenswrapper[4763]: E1006 16:27:25.866884 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="cinder-scheduler" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.866894 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="cinder-scheduler" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.867308 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="cinder-scheduler" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.867357 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" containerName="probe" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.868541 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: E1006 16:27:25.869585 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743\": container with ID starting with 42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743 not found: ID does not exist" containerID="42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.869655 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743"} err="failed to get container status \"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743\": rpc error: code = NotFound desc = could not find container \"42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743\": container with ID starting with 42c4faf72647cec15860eb76c10f897e09a48534dc3292d208f6de936044b743 not found: ID does not exist" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.871856 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.873467 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrmb\" (UniqueName: \"kubernetes.io/projected/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-kube-api-access-wmrmb\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:25 crc kubenswrapper[4763]: I1006 16:27:25.949332 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051287 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051385 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrmb\" (UniqueName: \"kubernetes.io/projected/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-kube-api-access-wmrmb\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.051781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.055890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.056087 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0" Oct 06 16:27:26 crc 
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.057000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0"
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.058206 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0"
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.069105 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrmb\" (UniqueName: \"kubernetes.io/projected/f0720e2f-6b06-46d1-beba-86d1e81f9f9b-kube-api-access-wmrmb\") pod \"cinder-scheduler-0\" (UID: \"f0720e2f-6b06-46d1-beba-86d1e81f9f9b\") " pod="openstack/cinder-scheduler-0"
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.240642 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.788642 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.807510 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f0720e2f-6b06-46d1-beba-86d1e81f9f9b","Type":"ContainerStarted","Data":"5029a58d6151e9a72120f96844de6af9c8beb81d8a3190edfe020f70bb598970"}
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.810385 4763 generic.go:334] "Generic (PLEG): container finished" podID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerID="c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d" exitCode=0
Oct 06 16:27:26 crc kubenswrapper[4763]: I1006 16:27:26.810463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerDied","Data":"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d"}
Oct 06 16:27:27 crc kubenswrapper[4763]: I1006 16:27:27.596331 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e163584a-58e6-44c2-a5d2-a977558f6eff" path="/var/lib/kubelet/pods/e163584a-58e6-44c2-a5d2-a977558f6eff/volumes"
Oct 06 16:27:27 crc kubenswrapper[4763]: I1006 16:27:27.841979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f0720e2f-6b06-46d1-beba-86d1e81f9f9b","Type":"ContainerStarted","Data":"50450fa2f0c7e475d38b3faf23ca8afbfdf29721f2a216e965a34f3bb37e95e1"}
Oct 06 16:27:27 crc kubenswrapper[4763]: I1006 16:27:27.845350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerStarted","Data":"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f"}
Oct 06 16:27:27 crc kubenswrapper[4763]: I1006 16:27:27.871426 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8qgj" podStartSLOduration=2.313471683 podStartE2EDuration="4.871407276s" podCreationTimestamp="2025-10-06 16:27:23 +0000 UTC" firstStartedPulling="2025-10-06 16:27:24.784885071 +0000 UTC m=+5641.940177603" lastFinishedPulling="2025-10-06 16:27:27.342820674 +0000 UTC m=+5644.498113196" observedRunningTime="2025-10-06 16:27:27.870774979 +0000 UTC m=+5645.026067501" watchObservedRunningTime="2025-10-06 16:27:27.871407276 +0000 UTC m=+5645.026699788"
Oct 06 16:27:28 crc kubenswrapper[4763]: I1006 16:27:28.860501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f0720e2f-6b06-46d1-beba-86d1e81f9f9b","Type":"ContainerStarted","Data":"61a76b06c063cb60cb405745c21935e085710d979aed2b55354988e5a7665ed4"}
Oct 06 16:27:28 crc kubenswrapper[4763]: I1006 16:27:28.878788 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.878770087 podStartE2EDuration="3.878770087s" podCreationTimestamp="2025-10-06 16:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:27:28.876852695 +0000 UTC m=+5646.032145247" watchObservedRunningTime="2025-10-06 16:27:28.878770087 +0000 UTC m=+5646.034062599"
Oct 06 16:27:29 crc kubenswrapper[4763]: I1006 16:27:29.782437 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.196975 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"]
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.200105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.219438 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"]
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.337162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.337689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v74q\" (UniqueName: \"kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.337736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.438929 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.439001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v74q\" (UniqueName: \"kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.439041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.439556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.439826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.469451 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v74q\" (UniqueName: \"kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q\") pod \"redhat-operators-qlrxm\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:30 crc kubenswrapper[4763]: I1006 16:27:30.524533 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlrxm"
Oct 06 16:27:31 crc kubenswrapper[4763]: I1006 16:27:31.028687 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"]
Oct 06 16:27:31 crc kubenswrapper[4763]: W1006 16:27:31.032044 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcfbe67a_c04a_4b51_a46d_af7d20e50541.slice/crio-27a0f53d25eb4aef1af1e9b96736b3a1b8e346e6bbd349c4b34e29091b84d19e WatchSource:0}: Error finding container 27a0f53d25eb4aef1af1e9b96736b3a1b8e346e6bbd349c4b34e29091b84d19e: Status 404 returned error can't find the container with id 27a0f53d25eb4aef1af1e9b96736b3a1b8e346e6bbd349c4b34e29091b84d19e
Oct 06 16:27:31 crc kubenswrapper[4763]: I1006 16:27:31.241167 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 06 16:27:31 crc kubenswrapper[4763]: I1006 16:27:31.915120 4763 generic.go:334] "Generic (PLEG): container finished" podID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerID="3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9" exitCode=0
Oct 06 16:27:31 crc kubenswrapper[4763]: I1006 16:27:31.915166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerDied","Data":"3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9"}
Oct 06 16:27:31 crc kubenswrapper[4763]: I1006 16:27:31.915189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerStarted","Data":"27a0f53d25eb4aef1af1e9b96736b3a1b8e346e6bbd349c4b34e29091b84d19e"}
Oct 06 16:27:32 crc kubenswrapper[4763]: I1006 16:27:32.927246 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerStarted","Data":"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8"}
Oct 06 16:27:34 crc kubenswrapper[4763]: I1006 16:27:34.166144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:34 crc kubenswrapper[4763]: I1006 16:27:34.166527 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:34 crc kubenswrapper[4763]: I1006 16:27:34.232170 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8qgj"
Oct 06 16:27:34 crc kubenswrapper[4763]: I1006 16:27:34.962360 4763 generic.go:334] "Generic (PLEG): container finished" podID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerID="be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8" exitCode=0
Oct 06 16:27:34 crc kubenswrapper[4763]: I1006 16:27:34.962445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerDied","Data":"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8"}
Oct 06 16:27:35 crc kubenswrapper[4763]: I1006 16:27:35.028602 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8qgj"
I1006 16:27:35.599662 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"] Oct 06 16:27:35 crc kubenswrapper[4763]: I1006 16:27:35.976864 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerStarted","Data":"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2"} Oct 06 16:27:36 crc kubenswrapper[4763]: I1006 16:27:36.008356 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qlrxm" podStartSLOduration=2.4808063750000002 podStartE2EDuration="6.008335117s" podCreationTimestamp="2025-10-06 16:27:30 +0000 UTC" firstStartedPulling="2025-10-06 16:27:31.916908686 +0000 UTC m=+5649.072201198" lastFinishedPulling="2025-10-06 16:27:35.444437388 +0000 UTC m=+5652.599729940" observedRunningTime="2025-10-06 16:27:36.000069915 +0000 UTC m=+5653.155362457" watchObservedRunningTime="2025-10-06 16:27:36.008335117 +0000 UTC m=+5653.163627639" Oct 06 16:27:36 crc kubenswrapper[4763]: I1006 16:27:36.481121 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 16:27:36 crc kubenswrapper[4763]: I1006 16:27:36.985412 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8qgj" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="registry-server" containerID="cri-o://baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f" gracePeriod=2 Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.479550 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8qgj" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.576044 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc7b8\" (UniqueName: \"kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8\") pod \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.576312 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities\") pod \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.576402 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content\") pod \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\" (UID: \"7069b7df-3bbb-4606-b4a2-811fb090e8dc\") " Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.578278 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities" (OuterVolumeSpecName: "utilities") pod "7069b7df-3bbb-4606-b4a2-811fb090e8dc" (UID: "7069b7df-3bbb-4606-b4a2-811fb090e8dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.590049 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8" (OuterVolumeSpecName: "kube-api-access-qc7b8") pod "7069b7df-3bbb-4606-b4a2-811fb090e8dc" (UID: "7069b7df-3bbb-4606-b4a2-811fb090e8dc"). InnerVolumeSpecName "kube-api-access-qc7b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.626592 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7069b7df-3bbb-4606-b4a2-811fb090e8dc" (UID: "7069b7df-3bbb-4606-b4a2-811fb090e8dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.680162 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.680229 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7069b7df-3bbb-4606-b4a2-811fb090e8dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.680243 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc7b8\" (UniqueName: \"kubernetes.io/projected/7069b7df-3bbb-4606-b4a2-811fb090e8dc-kube-api-access-qc7b8\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.997455 4763 generic.go:334] "Generic (PLEG): container finished" podID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerID="baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f" exitCode=0 Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.997519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerDied","Data":"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f"} Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.997550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8qgj" event={"ID":"7069b7df-3bbb-4606-b4a2-811fb090e8dc","Type":"ContainerDied","Data":"6f8bbb235eda64fb4856694f3d755e20eaf94daa8e58d393992068205f7725e3"} Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.997570 4763 scope.go:117] "RemoveContainer" containerID="baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f" Oct 06 16:27:37 crc kubenswrapper[4763]: I1006 16:27:37.997598 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8qgj" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.034520 4763 scope.go:117] "RemoveContainer" containerID="c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.048846 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"] Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.059787 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8qgj"] Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.067072 4763 scope.go:117] "RemoveContainer" containerID="14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.103643 4763 scope.go:117] "RemoveContainer" containerID="baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f" Oct 06 16:27:38 crc kubenswrapper[4763]: E1006 16:27:38.105226 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f\": container with ID starting with baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f not found: ID does not exist" containerID="baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.105268 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f"} err="failed to get container status \"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f\": rpc error: code = NotFound desc = could not find container \"baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f\": container with ID starting with baeabaa20fb1cab031e9f317a73ec01eeb1fa862e0da8f445e198b27d4116c0f not found: ID does not exist" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.105296 4763 scope.go:117] "RemoveContainer" containerID="c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d" Oct 06 16:27:38 crc kubenswrapper[4763]: E1006 16:27:38.105771 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d\": container with ID starting with c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d not found: ID does not exist" containerID="c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.105799 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d"} err="failed to get container status \"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d\": rpc error: code = NotFound desc = could not find container \"c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d\": container with ID starting with c2c08a4d282af4f4a59520afdeabc9c846908fc059d054d81d235d2dbe33af6d not found: ID does not exist" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.105814 4763 scope.go:117] "RemoveContainer" containerID="14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9" Oct 06 16:27:38 crc kubenswrapper[4763]: E1006 16:27:38.106094 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9\": container with ID starting with 14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9 not found: ID does not exist" containerID="14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9" Oct 06 16:27:38 crc kubenswrapper[4763]: I1006 16:27:38.106122 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9"} err="failed to get container status \"14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9\": rpc error: code = NotFound desc = could not find container \"14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9\": container with ID starting with 14a4b7f51515b56c9bb92300b5c05b33ad52414346e2ad05e56d4ed3ffb108d9 not found: ID does not exist" Oct 06 16:27:39 crc kubenswrapper[4763]: I1006 16:27:39.596192 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" path="/var/lib/kubelet/pods/7069b7df-3bbb-4606-b4a2-811fb090e8dc/volumes" Oct 06 16:27:40 crc kubenswrapper[4763]: I1006 16:27:40.525415 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:40 crc kubenswrapper[4763]: I1006 16:27:40.525938 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:41 crc kubenswrapper[4763]: I1006 16:27:41.580211 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qlrxm" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="registry-server" probeResult="failure" output=< Oct 06 16:27:41 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 16:27:41 crc kubenswrapper[4763]: > Oct 06 16:27:50 crc kubenswrapper[4763]: I1006 16:27:50.590006 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:50 crc kubenswrapper[4763]: I1006 16:27:50.644879 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:50 crc kubenswrapper[4763]: I1006 16:27:50.846960 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"] Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.178546 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qlrxm" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="registry-server" containerID="cri-o://ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2" gracePeriod=2 Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.660929 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.695234 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content\") pod \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.695598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v74q\" (UniqueName: \"kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q\") pod \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.695740 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities\") pod \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\" (UID: \"bcfbe67a-c04a-4b51-a46d-af7d20e50541\") " Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.696514 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities" (OuterVolumeSpecName: "utilities") pod "bcfbe67a-c04a-4b51-a46d-af7d20e50541" (UID: "bcfbe67a-c04a-4b51-a46d-af7d20e50541"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.702458 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q" (OuterVolumeSpecName: "kube-api-access-7v74q") pod "bcfbe67a-c04a-4b51-a46d-af7d20e50541" (UID: "bcfbe67a-c04a-4b51-a46d-af7d20e50541"). InnerVolumeSpecName "kube-api-access-7v74q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.784365 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcfbe67a-c04a-4b51-a46d-af7d20e50541" (UID: "bcfbe67a-c04a-4b51-a46d-af7d20e50541"). InnerVolumeSpecName "catalog-content". 
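The startup-probe output above, `timeout: failed to connect service ":50051" within 1s`, is what a gRPC health check with a one-second deadline prints while the registry-server's port is not yet accepting connections. A minimal equivalent in Go against the standard grpc_health_v1 service (the exact probe binary shipped in the catalog image is an assumption here):

```go
package main

import (
	"context"
	"fmt"
	"os"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	// WithBlock makes the dial fail once the 1s deadline expires before a
	// connection is established, the condition the probe reports above.
	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock())
	if err != nil {
		fmt.Fprintf(os.Stderr, "timeout: failed to connect service %q within 1s\n", ":50051")
		os.Exit(1)
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil || resp.Status != healthpb.HealthCheckResponse_SERVING {
		os.Exit(1)
	}
	fmt.Println("SERVING")
}
```

Once the server starts answering, the startup probe flips to status="started" and the readiness probe to "ready", which is exactly the 16:27:50 transition recorded above.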
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.798748 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.798783 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcfbe67a-c04a-4b51-a46d-af7d20e50541-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:52 crc kubenswrapper[4763]: I1006 16:27:52.798798 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v74q\" (UniqueName: \"kubernetes.io/projected/bcfbe67a-c04a-4b51-a46d-af7d20e50541-kube-api-access-7v74q\") on node \"crc\" DevicePath \"\"" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.191693 4763 generic.go:334] "Generic (PLEG): container finished" podID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerID="ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2" exitCode=0 Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.191763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerDied","Data":"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2"} Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.191840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlrxm" event={"ID":"bcfbe67a-c04a-4b51-a46d-af7d20e50541","Type":"ContainerDied","Data":"27a0f53d25eb4aef1af1e9b96736b3a1b8e346e6bbd349c4b34e29091b84d19e"} Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.191873 4763 scope.go:117] "RemoveContainer" containerID="ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.191892 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qlrxm" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.219363 4763 scope.go:117] "RemoveContainer" containerID="be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.245855 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"] Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.256094 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qlrxm"] Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.276846 4763 scope.go:117] "RemoveContainer" containerID="3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.305648 4763 scope.go:117] "RemoveContainer" containerID="ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2" Oct 06 16:27:53 crc kubenswrapper[4763]: E1006 16:27:53.306124 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2\": container with ID starting with ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2 not found: ID does not exist" containerID="ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.306162 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2"} err="failed to get container status \"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2\": rpc error: code = NotFound desc = could not find container \"ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2\": container with ID starting with ce388089fb9fbee82354edbb7100a816cdfce4fb3fe9c39617968737e248f5e2 not found: ID does not exist" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.306188 4763 scope.go:117] "RemoveContainer" containerID="be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8" Oct 06 16:27:53 crc kubenswrapper[4763]: E1006 16:27:53.306575 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8\": container with ID starting with be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8 not found: ID does not exist" containerID="be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.306604 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8"} err="failed to get container status \"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8\": rpc error: code = NotFound desc = could not find container \"be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8\": container with ID starting with be7b57fd3bd375c9850e30d56f91c7badf73a03d94da2066e33359868a6861b8 not found: ID does not exist" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.306645 4763 scope.go:117] "RemoveContainer" containerID="3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9" Oct 06 16:27:53 crc kubenswrapper[4763]: E1006 16:27:53.306983 4763 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9\": container with ID starting with 3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9 not found: ID does not exist" containerID="3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.307007 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9"} err="failed to get container status \"3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9\": rpc error: code = NotFound desc = could not find container \"3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9\": container with ID starting with 3a6101ac6aaea04d99ce0b69ee2b860119fb774794ee6627a15601e0bb0424b9 not found: ID does not exist" Oct 06 16:27:53 crc kubenswrapper[4763]: I1006 16:27:53.595025 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" path="/var/lib/kubelet/pods/bcfbe67a-c04a-4b51-a46d-af7d20e50541/volumes" Oct 06 16:27:58 crc kubenswrapper[4763]: I1006 16:27:58.063724 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wgs95"] Oct 06 16:27:58 crc kubenswrapper[4763]: I1006 16:27:58.074720 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wgs95"] Oct 06 16:27:59 crc kubenswrapper[4763]: I1006 16:27:59.587653 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1" path="/var/lib/kubelet/pods/2cf5b135-5bcc-4ef1-ba0a-d79f29e777c1/volumes" Oct 06 16:28:08 crc kubenswrapper[4763]: I1006 16:28:08.035380 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8e6c-account-create-x6w5b"] Oct 06 16:28:08 crc kubenswrapper[4763]: I1006 16:28:08.045606 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8e6c-account-create-x6w5b"] Oct 06 16:28:09 crc kubenswrapper[4763]: I1006 16:28:09.608516 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90efc09-74b5-490d-b52c-a14d53f80fae" path="/var/lib/kubelet/pods/c90efc09-74b5-490d-b52c-a14d53f80fae/volumes" Oct 06 16:28:15 crc kubenswrapper[4763]: I1006 16:28:15.068939 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-s7wn2"] Oct 06 16:28:15 crc kubenswrapper[4763]: I1006 16:28:15.080542 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-s7wn2"] Oct 06 16:28:15 crc kubenswrapper[4763]: I1006 16:28:15.594595 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09bd4233-a8c5-48aa-b586-28cbf7c52d06" path="/var/lib/kubelet/pods/09bd4233-a8c5-48aa-b586-28cbf7c52d06/volumes" Oct 06 16:28:28 crc kubenswrapper[4763]: I1006 16:28:28.034108 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-znxl7"] Oct 06 16:28:28 crc kubenswrapper[4763]: I1006 16:28:28.044180 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-znxl7"] Oct 06 16:28:29 crc kubenswrapper[4763]: I1006 16:28:29.586714 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c967d347-b476-40be-8c24-7088d6ee6630" path="/var/lib/kubelet/pods/c967d347-b476-40be-8c24-7088d6ee6630/volumes" Oct 06 16:28:30 crc kubenswrapper[4763]: I1006 16:28:30.239833 4763 
scope.go:117] "RemoveContainer" containerID="3ab2ce31a2ecb94217cb9981df634b19563bea49ecb03e202dbbcd1f98ce06ce" Oct 06 16:28:30 crc kubenswrapper[4763]: I1006 16:28:30.298827 4763 scope.go:117] "RemoveContainer" containerID="4b122b9a4336f1356347c1bba23fa0c9a87c656067e34a512b71cafce3c44a63" Oct 06 16:28:30 crc kubenswrapper[4763]: I1006 16:28:30.336298 4763 scope.go:117] "RemoveContainer" containerID="95836c7d627374295ea72b01fc8df22b5af9de1f2d87cb7f1a4f679b13dd1e82" Oct 06 16:28:30 crc kubenswrapper[4763]: I1006 16:28:30.397421 4763 scope.go:117] "RemoveContainer" containerID="749c7a68ebd2e3de68a7ce7596666947ddc5449c59454de76585158b70182d6f" Oct 06 16:29:03 crc kubenswrapper[4763]: I1006 16:29:03.876900 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:29:03 crc kubenswrapper[4763]: I1006 16:29:03.877771 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.012671 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q54nb"] Oct 06 16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.013861 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="registry-server" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.013883 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="registry-server" Oct 06 16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.013897 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="extract-utilities" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.013906 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="extract-utilities" Oct 06 16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.013930 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="extract-content" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.013941 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="extract-content" Oct 06 16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.013966 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="extract-utilities" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.013974 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="extract-utilities" Oct 06 16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.013987 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="registry-server" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.013995 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="registry-server" Oct 06 
16:29:19 crc kubenswrapper[4763]: E1006 16:29:19.014011 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="extract-content" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.014040 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="extract-content" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.014263 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfbe67a-c04a-4b51-a46d-af7d20e50541" containerName="registry-server" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.014284 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7069b7df-3bbb-4606-b4a2-811fb090e8dc" containerName="registry-server" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.015258 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.016788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-sn94d" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.017688 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.028571 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-msztv"] Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.030609 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.042032 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q54nb"] Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.077328 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-msztv"] Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.091199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50642abe-8228-47b7-9690-67d289d195a9-scripts\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.091304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.091346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.091401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-log-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: 
I1006 16:29:19.091501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4ps\" (UniqueName: \"kubernetes.io/projected/50642abe-8228-47b7-9690-67d289d195a9-kube-api-access-qz4ps\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661e44f6-cd55-465f-beec-5061aa883c44-scripts\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-lib\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4ps\" (UniqueName: \"kubernetes.io/projected/50642abe-8228-47b7-9690-67d289d195a9-kube-api-access-qz4ps\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-run\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193627 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50642abe-8228-47b7-9690-67d289d195a9-scripts\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.193969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-log\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194144 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-log-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmbn\" (UniqueName: \"kubernetes.io/projected/661e44f6-cd55-465f-beec-5061aa883c44-kube-api-access-txmbn\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-run-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/50642abe-8228-47b7-9690-67d289d195a9-var-log-ovn\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.194353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-etc-ovs\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.196239 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50642abe-8228-47b7-9690-67d289d195a9-scripts\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.212407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4ps\" (UniqueName: \"kubernetes.io/projected/50642abe-8228-47b7-9690-67d289d195a9-kube-api-access-qz4ps\") pod \"ovn-controller-q54nb\" (UID: \"50642abe-8228-47b7-9690-67d289d195a9\") " pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.295972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-etc-ovs\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661e44f6-cd55-465f-beec-5061aa883c44-scripts\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc 
kubenswrapper[4763]: I1006 16:29:19.296262 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-etc-ovs\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-lib\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-lib\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296711 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-run\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-log\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.296930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txmbn\" (UniqueName: \"kubernetes.io/projected/661e44f6-cd55-465f-beec-5061aa883c44-kube-api-access-txmbn\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.297393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-run\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.297478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/661e44f6-cd55-465f-beec-5061aa883c44-var-log\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.301590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661e44f6-cd55-465f-beec-5061aa883c44-scripts\") pod \"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.315841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txmbn\" (UniqueName: \"kubernetes.io/projected/661e44f6-cd55-465f-beec-5061aa883c44-kube-api-access-txmbn\") pod 
\"ovn-controller-ovs-msztv\" (UID: \"661e44f6-cd55-465f-beec-5061aa883c44\") " pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.338845 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.348008 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:19 crc kubenswrapper[4763]: I1006 16:29:19.811968 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q54nb"] Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.157526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q54nb" event={"ID":"50642abe-8228-47b7-9690-67d289d195a9","Type":"ContainerStarted","Data":"813de436bead9e90a02bf9037e0df7215563523b28b741950c923e4295f596aa"} Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.224098 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-msztv"] Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.578288 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bjmp2"] Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.580073 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.588809 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.591445 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bjmp2"] Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.731719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovs-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.731924 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-config\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.732123 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovn-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.732320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9pf\" (UniqueName: \"kubernetes.io/projected/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-kube-api-access-kb9pf\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.833845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-config\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.833920 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovn-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.833969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9pf\" (UniqueName: \"kubernetes.io/projected/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-kube-api-access-kb9pf\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.834023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovs-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.834251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovs-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.834298 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-ovn-rundir\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.834527 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-config\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.868300 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9pf\" (UniqueName: \"kubernetes.io/projected/1655bf62-c30d-4f1b-9c19-0fae5ec2a8da-kube-api-access-kb9pf\") pod \"ovn-controller-metrics-bjmp2\" (UID: \"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da\") " pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:20 crc kubenswrapper[4763]: I1006 16:29:20.909670 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bjmp2" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.173761 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q54nb" event={"ID":"50642abe-8228-47b7-9690-67d289d195a9","Type":"ContainerStarted","Data":"1f5fffc738077bcd7eb74f2a130ba3461d5496224cae24597b856b82a2fc0956"} Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.173997 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-q54nb" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.180100 4763 generic.go:334] "Generic (PLEG): container finished" podID="661e44f6-cd55-465f-beec-5061aa883c44" containerID="0ca68d695988c87050f8f4469fdecee0ee694c9fc171d26fe3cc47d78ded1c37" exitCode=0 Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.180152 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-msztv" event={"ID":"661e44f6-cd55-465f-beec-5061aa883c44","Type":"ContainerDied","Data":"0ca68d695988c87050f8f4469fdecee0ee694c9fc171d26fe3cc47d78ded1c37"} Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.180183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-msztv" event={"ID":"661e44f6-cd55-465f-beec-5061aa883c44","Type":"ContainerStarted","Data":"f4d4f4e7d5cff6a38baa043b3681afa5a6b6750ed080c5ab2e28c32ce00ed0f4"} Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.197413 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q54nb" podStartSLOduration=3.197395857 podStartE2EDuration="3.197395857s" podCreationTimestamp="2025-10-06 16:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:29:21.192200798 +0000 UTC m=+5758.347493320" watchObservedRunningTime="2025-10-06 16:29:21.197395857 +0000 UTC m=+5758.352688369" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.450830 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bjmp2"] Oct 06 16:29:21 crc kubenswrapper[4763]: W1006 16:29:21.452569 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1655bf62_c30d_4f1b_9c19_0fae5ec2a8da.slice/crio-b01643a97e8f8c9d18019d7990ae0c93710718c59283c691531fc365fb1c530d WatchSource:0}: Error finding container b01643a97e8f8c9d18019d7990ae0c93710718c59283c691531fc365fb1c530d: Status 404 returned error can't find the container with id b01643a97e8f8c9d18019d7990ae0c93710718c59283c691531fc365fb1c530d Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.679898 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-kpc9x"] Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.681484 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.688931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-kpc9x"] Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.754665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnhx\" (UniqueName: \"kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx\") pod \"octavia-db-create-kpc9x\" (UID: \"0e669be3-799e-4707-85e3-68ae9c66cdaf\") " pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.856715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnhx\" (UniqueName: \"kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx\") pod \"octavia-db-create-kpc9x\" (UID: \"0e669be3-799e-4707-85e3-68ae9c66cdaf\") " pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:21 crc kubenswrapper[4763]: I1006 16:29:21.881168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnhx\" (UniqueName: \"kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx\") pod \"octavia-db-create-kpc9x\" (UID: \"0e669be3-799e-4707-85e3-68ae9c66cdaf\") " pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.004355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.192350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-msztv" event={"ID":"661e44f6-cd55-465f-beec-5061aa883c44","Type":"ContainerStarted","Data":"6a2ad5fded747e7844d6524eeec529d7f684424e647ba0d00db6a6f1083397ad"} Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.192811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-msztv" event={"ID":"661e44f6-cd55-465f-beec-5061aa883c44","Type":"ContainerStarted","Data":"51e6e273f29b6b25f059bf8b7f588324927f71c60b83213b2f14400af3a26025"} Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.193077 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.199587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bjmp2" event={"ID":"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da","Type":"ContainerStarted","Data":"e40f0cdc9a6531e1fc1c640d13c0fe557c75f920754417810aafcfab7320f6f8"} Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.199684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bjmp2" event={"ID":"1655bf62-c30d-4f1b-9c19-0fae5ec2a8da","Type":"ContainerStarted","Data":"b01643a97e8f8c9d18019d7990ae0c93710718c59283c691531fc365fb1c530d"} Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.234196 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-msztv" podStartSLOduration=4.234169596 podStartE2EDuration="4.234169596s" podCreationTimestamp="2025-10-06 16:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:29:22.226430818 +0000 UTC m=+5759.381723360" watchObservedRunningTime="2025-10-06 16:29:22.234169596 
+0000 UTC m=+5759.389462108" Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.256561 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bjmp2" podStartSLOduration=2.256533725 podStartE2EDuration="2.256533725s" podCreationTimestamp="2025-10-06 16:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:29:22.243564787 +0000 UTC m=+5759.398857299" watchObservedRunningTime="2025-10-06 16:29:22.256533725 +0000 UTC m=+5759.411826247" Oct 06 16:29:22 crc kubenswrapper[4763]: I1006 16:29:22.460247 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-kpc9x"] Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.100901 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.103123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.118517 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.186154 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.186411 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.186522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhh6x\" (UniqueName: \"kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.210831 4763 generic.go:334] "Generic (PLEG): container finished" podID="0e669be3-799e-4707-85e3-68ae9c66cdaf" containerID="746d8d2f42749b85895dc863c90929f148a3214c0fe4ffb999d9f0a94ce9637c" exitCode=0 Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.210919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-kpc9x" event={"ID":"0e669be3-799e-4707-85e3-68ae9c66cdaf","Type":"ContainerDied","Data":"746d8d2f42749b85895dc863c90929f148a3214c0fe4ffb999d9f0a94ce9637c"} Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.210967 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-kpc9x" event={"ID":"0e669be3-799e-4707-85e3-68ae9c66cdaf","Type":"ContainerStarted","Data":"46c68725a4bce4f04e8b2be35b2e55f95551703adb8001dfa18eb60f03be4a5c"} Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.211889 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.288384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhh6x\" (UniqueName: \"kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.288499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.288646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.289214 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.289500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.314974 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhh6x\" (UniqueName: \"kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x\") pod \"redhat-marketplace-lggjb\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.440741 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:23 crc kubenswrapper[4763]: I1006 16:29:23.950156 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.220854 4763 generic.go:334] "Generic (PLEG): container finished" podID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerID="27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f" exitCode=0 Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.220932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerDied","Data":"27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f"} Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.221294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerStarted","Data":"fbe5b088007b92420523d23f5bffbf291255c536a24a11607f4c8ad0042f58b4"} Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.589139 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.717682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxnhx\" (UniqueName: \"kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx\") pod \"0e669be3-799e-4707-85e3-68ae9c66cdaf\" (UID: \"0e669be3-799e-4707-85e3-68ae9c66cdaf\") " Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.726151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx" (OuterVolumeSpecName: "kube-api-access-bxnhx") pod "0e669be3-799e-4707-85e3-68ae9c66cdaf" (UID: "0e669be3-799e-4707-85e3-68ae9c66cdaf"). InnerVolumeSpecName "kube-api-access-bxnhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:24 crc kubenswrapper[4763]: I1006 16:29:24.819873 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxnhx\" (UniqueName: \"kubernetes.io/projected/0e669be3-799e-4707-85e3-68ae9c66cdaf-kube-api-access-bxnhx\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:25 crc kubenswrapper[4763]: I1006 16:29:25.233600 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-kpc9x" event={"ID":"0e669be3-799e-4707-85e3-68ae9c66cdaf","Type":"ContainerDied","Data":"46c68725a4bce4f04e8b2be35b2e55f95551703adb8001dfa18eb60f03be4a5c"} Oct 06 16:29:25 crc kubenswrapper[4763]: I1006 16:29:25.233663 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c68725a4bce4f04e8b2be35b2e55f95551703adb8001dfa18eb60f03be4a5c" Oct 06 16:29:25 crc kubenswrapper[4763]: I1006 16:29:25.233713 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-kpc9x" Oct 06 16:29:25 crc kubenswrapper[4763]: I1006 16:29:25.235412 4763 generic.go:334] "Generic (PLEG): container finished" podID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerID="e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7" exitCode=0 Oct 06 16:29:25 crc kubenswrapper[4763]: I1006 16:29:25.235479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerDied","Data":"e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7"} Oct 06 16:29:26 crc kubenswrapper[4763]: I1006 16:29:26.247714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerStarted","Data":"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d"} Oct 06 16:29:26 crc kubenswrapper[4763]: I1006 16:29:26.273514 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lggjb" podStartSLOduration=1.44686484 podStartE2EDuration="3.27349176s" podCreationTimestamp="2025-10-06 16:29:23 +0000 UTC" firstStartedPulling="2025-10-06 16:29:24.223104135 +0000 UTC m=+5761.378396637" lastFinishedPulling="2025-10-06 16:29:26.049731035 +0000 UTC m=+5763.205023557" observedRunningTime="2025-10-06 16:29:26.266620386 +0000 UTC m=+5763.421912938" watchObservedRunningTime="2025-10-06 16:29:26.27349176 +0000 UTC m=+5763.428784292" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.441494 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.442070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.516495 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.649071 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-452a-account-create-dtrk2"] Oct 06 16:29:33 crc kubenswrapper[4763]: E1006 16:29:33.649640 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e669be3-799e-4707-85e3-68ae9c66cdaf" containerName="mariadb-database-create" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.649664 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e669be3-799e-4707-85e3-68ae9c66cdaf" containerName="mariadb-database-create" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.649889 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e669be3-799e-4707-85e3-68ae9c66cdaf" containerName="mariadb-database-create" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.650559 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.653915 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.661682 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-452a-account-create-dtrk2"] Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.805478 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjx8\" (UniqueName: \"kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8\") pod \"octavia-452a-account-create-dtrk2\" (UID: \"50a671e8-02b3-4112-bc33-dcbf2cbe206a\") " pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.877124 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.877217 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.908160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjx8\" (UniqueName: \"kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8\") pod \"octavia-452a-account-create-dtrk2\" (UID: \"50a671e8-02b3-4112-bc33-dcbf2cbe206a\") " pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.931147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjx8\" (UniqueName: \"kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8\") pod \"octavia-452a-account-create-dtrk2\" (UID: \"50a671e8-02b3-4112-bc33-dcbf2cbe206a\") " pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:33 crc kubenswrapper[4763]: I1006 16:29:33.982912 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:34 crc kubenswrapper[4763]: I1006 16:29:34.383563 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:34 crc kubenswrapper[4763]: I1006 16:29:34.454134 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:34 crc kubenswrapper[4763]: I1006 16:29:34.467568 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-452a-account-create-dtrk2"] Oct 06 16:29:35 crc kubenswrapper[4763]: I1006 16:29:35.334063 4763 generic.go:334] "Generic (PLEG): container finished" podID="50a671e8-02b3-4112-bc33-dcbf2cbe206a" containerID="e796b2dae6f1a255cd2e3b61398559032408ff2e8e713cb8186a90663e4ded42" exitCode=0 Oct 06 16:29:35 crc kubenswrapper[4763]: I1006 16:29:35.334186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-452a-account-create-dtrk2" event={"ID":"50a671e8-02b3-4112-bc33-dcbf2cbe206a","Type":"ContainerDied","Data":"e796b2dae6f1a255cd2e3b61398559032408ff2e8e713cb8186a90663e4ded42"} Oct 06 16:29:35 crc kubenswrapper[4763]: I1006 16:29:35.334244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-452a-account-create-dtrk2" event={"ID":"50a671e8-02b3-4112-bc33-dcbf2cbe206a","Type":"ContainerStarted","Data":"2d45ecd39750317b5e1de7b09dfc2c9b35620a37bf18279bbd881f0857503dc3"} Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.348740 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lggjb" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="registry-server" containerID="cri-o://2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d" gracePeriod=2 Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.795067 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.808081 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.873826 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpjx8\" (UniqueName: \"kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8\") pod \"50a671e8-02b3-4112-bc33-dcbf2cbe206a\" (UID: \"50a671e8-02b3-4112-bc33-dcbf2cbe206a\") " Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.883086 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8" (OuterVolumeSpecName: "kube-api-access-tpjx8") pod "50a671e8-02b3-4112-bc33-dcbf2cbe206a" (UID: "50a671e8-02b3-4112-bc33-dcbf2cbe206a"). InnerVolumeSpecName "kube-api-access-tpjx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.976472 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhh6x\" (UniqueName: \"kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x\") pod \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.976550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities\") pod \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.976975 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content\") pod \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\" (UID: \"cc0d7022-1693-4db9-a5e4-9cde9735b3f1\") " Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.977602 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpjx8\" (UniqueName: \"kubernetes.io/projected/50a671e8-02b3-4112-bc33-dcbf2cbe206a-kube-api-access-tpjx8\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.978099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities" (OuterVolumeSpecName: "utilities") pod "cc0d7022-1693-4db9-a5e4-9cde9735b3f1" (UID: "cc0d7022-1693-4db9-a5e4-9cde9735b3f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.982334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x" (OuterVolumeSpecName: "kube-api-access-vhh6x") pod "cc0d7022-1693-4db9-a5e4-9cde9735b3f1" (UID: "cc0d7022-1693-4db9-a5e4-9cde9735b3f1"). InnerVolumeSpecName "kube-api-access-vhh6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:36 crc kubenswrapper[4763]: I1006 16:29:36.989889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc0d7022-1693-4db9-a5e4-9cde9735b3f1" (UID: "cc0d7022-1693-4db9-a5e4-9cde9735b3f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.079458 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhh6x\" (UniqueName: \"kubernetes.io/projected/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-kube-api-access-vhh6x\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.079513 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.079527 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0d7022-1693-4db9-a5e4-9cde9735b3f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.361881 4763 generic.go:334] "Generic (PLEG): container finished" podID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerID="2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d" exitCode=0 Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.361951 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lggjb" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.361968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerDied","Data":"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d"} Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.362445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lggjb" event={"ID":"cc0d7022-1693-4db9-a5e4-9cde9735b3f1","Type":"ContainerDied","Data":"fbe5b088007b92420523d23f5bffbf291255c536a24a11607f4c8ad0042f58b4"} Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.362481 4763 scope.go:117] "RemoveContainer" containerID="2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.364424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-452a-account-create-dtrk2" event={"ID":"50a671e8-02b3-4112-bc33-dcbf2cbe206a","Type":"ContainerDied","Data":"2d45ecd39750317b5e1de7b09dfc2c9b35620a37bf18279bbd881f0857503dc3"} Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.364453 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d45ecd39750317b5e1de7b09dfc2c9b35620a37bf18279bbd881f0857503dc3" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.364512 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-452a-account-create-dtrk2" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.385846 4763 scope.go:117] "RemoveContainer" containerID="e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.407223 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.415962 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lggjb"] Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.416949 4763 scope.go:117] "RemoveContainer" containerID="27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.438108 4763 scope.go:117] "RemoveContainer" containerID="2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d" Oct 06 16:29:37 crc kubenswrapper[4763]: E1006 16:29:37.438587 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d\": container with ID starting with 2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d not found: ID does not exist" containerID="2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.438653 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d"} err="failed to get container status \"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d\": rpc error: code = NotFound desc = could not find container \"2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d\": container with ID starting with 2ff9c70591bd5d213965fc12d27720dfe6f824688cb734a0a9e7f3391623335d not found: ID does not exist" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.438688 4763 scope.go:117] "RemoveContainer" containerID="e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7" Oct 06 16:29:37 crc kubenswrapper[4763]: E1006 16:29:37.439091 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7\": container with ID starting with e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7 not found: ID does not exist" containerID="e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.439114 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7"} err="failed to get container status \"e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7\": rpc error: code = NotFound desc = could not find container \"e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7\": container with ID starting with e7bae3e606d4e4c7bb0c9a5173ec0f55e28cfd2c3a73dbeab040bd4e4ad2c7c7 not found: ID does not exist" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.439130 4763 scope.go:117] "RemoveContainer" containerID="27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f" Oct 06 16:29:37 crc kubenswrapper[4763]: E1006 16:29:37.439479 4763 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f\": container with ID starting with 27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f not found: ID does not exist" containerID="27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.439515 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f"} err="failed to get container status \"27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f\": rpc error: code = NotFound desc = could not find container \"27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f\": container with ID starting with 27ed9822eed630fc77fdf08eb13b81d78c1a5ba9bffbfd69ca1a9d0d72281e0f not found: ID does not exist" Oct 06 16:29:37 crc kubenswrapper[4763]: I1006 16:29:37.589605 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" path="/var/lib/kubelet/pods/cc0d7022-1693-4db9-a5e4-9cde9735b3f1/volumes" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.252683 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-p26fh"] Oct 06 16:29:40 crc kubenswrapper[4763]: E1006 16:29:40.254977 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="extract-utilities" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.255150 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="extract-utilities" Oct 06 16:29:40 crc kubenswrapper[4763]: E1006 16:29:40.255296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a671e8-02b3-4112-bc33-dcbf2cbe206a" containerName="mariadb-account-create" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.255427 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a671e8-02b3-4112-bc33-dcbf2cbe206a" containerName="mariadb-account-create" Oct 06 16:29:40 crc kubenswrapper[4763]: E1006 16:29:40.255569 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="registry-server" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.255737 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="registry-server" Oct 06 16:29:40 crc kubenswrapper[4763]: E1006 16:29:40.255880 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="extract-content" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.255996 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="extract-content" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.256471 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a671e8-02b3-4112-bc33-dcbf2cbe206a" containerName="mariadb-account-create" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.256649 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0d7022-1693-4db9-a5e4-9cde9735b3f1" containerName="registry-server" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.257777 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.266418 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-p26fh"] Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.345521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpw55\" (UniqueName: \"kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55\") pod \"octavia-persistence-db-create-p26fh\" (UID: \"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1\") " pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.448126 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpw55\" (UniqueName: \"kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55\") pod \"octavia-persistence-db-create-p26fh\" (UID: \"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1\") " pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.484922 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpw55\" (UniqueName: \"kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55\") pod \"octavia-persistence-db-create-p26fh\" (UID: \"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1\") " pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:40 crc kubenswrapper[4763]: I1006 16:29:40.579130 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:41 crc kubenswrapper[4763]: I1006 16:29:41.057201 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-p26fh"] Oct 06 16:29:41 crc kubenswrapper[4763]: I1006 16:29:41.403845 4763 generic.go:334] "Generic (PLEG): container finished" podID="3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" containerID="52ce0e843e0439bfe7ae2d2c2ad0a27ec9b548b4e7da5f65ab8a7827e1d03bc7" exitCode=0 Oct 06 16:29:41 crc kubenswrapper[4763]: I1006 16:29:41.403925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-p26fh" event={"ID":"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1","Type":"ContainerDied","Data":"52ce0e843e0439bfe7ae2d2c2ad0a27ec9b548b4e7da5f65ab8a7827e1d03bc7"} Oct 06 16:29:41 crc kubenswrapper[4763]: I1006 16:29:41.404283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-p26fh" event={"ID":"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1","Type":"ContainerStarted","Data":"ed9a53d5f98aef6b07c2b0eb81344bbf6d2a1a731c3a97dda315bd4b5199364a"} Oct 06 16:29:42 crc kubenswrapper[4763]: I1006 16:29:42.781946 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:42 crc kubenswrapper[4763]: I1006 16:29:42.896373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpw55\" (UniqueName: \"kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55\") pod \"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1\" (UID: \"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1\") " Oct 06 16:29:42 crc kubenswrapper[4763]: I1006 16:29:42.905818 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55" (OuterVolumeSpecName: "kube-api-access-wpw55") pod "3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" (UID: "3c5f2bac-bb4b-4d51-b29d-6af33cb541d1"). InnerVolumeSpecName "kube-api-access-wpw55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:42 crc kubenswrapper[4763]: I1006 16:29:42.998435 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpw55\" (UniqueName: \"kubernetes.io/projected/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1-kube-api-access-wpw55\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:43 crc kubenswrapper[4763]: I1006 16:29:43.444914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-p26fh" event={"ID":"3c5f2bac-bb4b-4d51-b29d-6af33cb541d1","Type":"ContainerDied","Data":"ed9a53d5f98aef6b07c2b0eb81344bbf6d2a1a731c3a97dda315bd4b5199364a"} Oct 06 16:29:43 crc kubenswrapper[4763]: I1006 16:29:43.444968 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9a53d5f98aef6b07c2b0eb81344bbf6d2a1a731c3a97dda315bd4b5199364a" Oct 06 16:29:43 crc kubenswrapper[4763]: I1006 16:29:43.445055 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-p26fh" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.358440 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-6080-account-create-rrkhw"] Oct 06 16:29:51 crc kubenswrapper[4763]: E1006 16:29:51.359632 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" containerName="mariadb-database-create" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.359657 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" containerName="mariadb-database-create" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.359934 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" containerName="mariadb-database-create" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.360963 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.363472 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.369188 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-6080-account-create-rrkhw"] Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.483304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8h4\" (UniqueName: \"kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4\") pod \"octavia-6080-account-create-rrkhw\" (UID: \"7e2a63b6-8ba6-4805-9a86-477196011cb3\") " pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.587209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq8h4\" (UniqueName: \"kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4\") pod \"octavia-6080-account-create-rrkhw\" (UID: \"7e2a63b6-8ba6-4805-9a86-477196011cb3\") " pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.634082 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8h4\" (UniqueName: \"kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4\") pod \"octavia-6080-account-create-rrkhw\" (UID: \"7e2a63b6-8ba6-4805-9a86-477196011cb3\") " pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:51 crc kubenswrapper[4763]: I1006 16:29:51.690686 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:52 crc kubenswrapper[4763]: I1006 16:29:52.167785 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-6080-account-create-rrkhw"] Oct 06 16:29:52 crc kubenswrapper[4763]: I1006 16:29:52.525085 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e2a63b6-8ba6-4805-9a86-477196011cb3" containerID="696dbf734d2dfb950954cd71adc736367d290fe7c655783de0fff888995a8be0" exitCode=0 Oct 06 16:29:52 crc kubenswrapper[4763]: I1006 16:29:52.525240 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6080-account-create-rrkhw" event={"ID":"7e2a63b6-8ba6-4805-9a86-477196011cb3","Type":"ContainerDied","Data":"696dbf734d2dfb950954cd71adc736367d290fe7c655783de0fff888995a8be0"} Oct 06 16:29:52 crc kubenswrapper[4763]: I1006 16:29:52.525378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6080-account-create-rrkhw" event={"ID":"7e2a63b6-8ba6-4805-9a86-477196011cb3","Type":"ContainerStarted","Data":"70ba104dbeb3743f09ab643f4c952b3437222270f1fb5f539804030f6e7d8b3b"} Oct 06 16:29:53 crc kubenswrapper[4763]: I1006 16:29:53.942659 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.041233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq8h4\" (UniqueName: \"kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4\") pod \"7e2a63b6-8ba6-4805-9a86-477196011cb3\" (UID: \"7e2a63b6-8ba6-4805-9a86-477196011cb3\") " Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.045858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4" (OuterVolumeSpecName: "kube-api-access-gq8h4") pod "7e2a63b6-8ba6-4805-9a86-477196011cb3" (UID: "7e2a63b6-8ba6-4805-9a86-477196011cb3"). InnerVolumeSpecName "kube-api-access-gq8h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.143591 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq8h4\" (UniqueName: \"kubernetes.io/projected/7e2a63b6-8ba6-4805-9a86-477196011cb3-kube-api-access-gq8h4\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.403721 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q54nb" podUID="50642abe-8228-47b7-9690-67d289d195a9" containerName="ovn-controller" probeResult="failure" output=< Oct 06 16:29:54 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 16:29:54 crc kubenswrapper[4763]: > Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.407269 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.422088 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-msztv" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.540472 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q54nb-config-7j6hw"] Oct 06 16:29:54 crc kubenswrapper[4763]: E1006 16:29:54.540995 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2a63b6-8ba6-4805-9a86-477196011cb3" containerName="mariadb-account-create" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.541018 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2a63b6-8ba6-4805-9a86-477196011cb3" containerName="mariadb-account-create" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.541300 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2a63b6-8ba6-4805-9a86-477196011cb3" containerName="mariadb-account-create" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.542123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.544256 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.555300 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q54nb-config-7j6hw"] Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.565189 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6080-account-create-rrkhw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.565810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6080-account-create-rrkhw" event={"ID":"7e2a63b6-8ba6-4805-9a86-477196011cb3","Type":"ContainerDied","Data":"70ba104dbeb3743f09ab643f4c952b3437222270f1fb5f539804030f6e7d8b3b"} Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.565849 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ba104dbeb3743f09ab643f4c952b3437222270f1fb5f539804030f6e7d8b3b" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75fzc\" (UniqueName: \"kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652328 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.652408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754164 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 
16:29:54.754240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75fzc\" (UniqueName: \"kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754305 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754427 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754634 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.754679 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.755092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.756269 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.770959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75fzc\" (UniqueName: \"kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc\") pod \"ovn-controller-q54nb-config-7j6hw\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:54 crc kubenswrapper[4763]: I1006 16:29:54.867887 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:55 crc kubenswrapper[4763]: I1006 16:29:55.349198 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q54nb-config-7j6hw"] Oct 06 16:29:55 crc kubenswrapper[4763]: I1006 16:29:55.585758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q54nb-config-7j6hw" event={"ID":"e27477e5-a68e-464c-99bd-63e578881c82","Type":"ContainerStarted","Data":"1440d159581e4b7c210ac95e3d2b823cd2cc28b3cf6820299704055cec28b6a5"} Oct 06 16:29:56 crc kubenswrapper[4763]: I1006 16:29:56.596855 4763 generic.go:334] "Generic (PLEG): container finished" podID="e27477e5-a68e-464c-99bd-63e578881c82" containerID="fef0a40dce06fb915401acb0a7473f371e68c456a8f45f689fb374e5fc4e9705" exitCode=0 Oct 06 16:29:56 crc kubenswrapper[4763]: I1006 16:29:56.597179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q54nb-config-7j6hw" event={"ID":"e27477e5-a68e-464c-99bd-63e578881c82","Type":"ContainerDied","Data":"fef0a40dce06fb915401acb0a7473f371e68c456a8f45f689fb374e5fc4e9705"} Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.828232 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-84d8c674c4-6d5xr"] Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.831329 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.833561 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.834822 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-zfv98" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.834941 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.846145 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-84d8c674c4-6d5xr"] Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.919570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data-merged\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.919809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-combined-ca-bundle\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.919991 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-scripts\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.920215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-octavia-run\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:57 crc kubenswrapper[4763]: I1006 16:29:57.920254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.021962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-octavia-run\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.022018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc 
kubenswrapper[4763]: I1006 16:29:58.022056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data-merged\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.022111 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-combined-ca-bundle\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.022191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-scripts\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.022690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-octavia-run\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.022963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data-merged\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.027907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-scripts\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.030075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-combined-ca-bundle\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.030399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73edae2f-0842-49e6-b4ed-b05e3b60d53c-config-data\") pod \"octavia-api-84d8c674c4-6d5xr\" (UID: \"73edae2f-0842-49e6-b4ed-b05e3b60d53c\") " pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.100856 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.199083 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.225142 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.225413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.225633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.225764 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75fzc\" (UniqueName: \"kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.225888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.226250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts\") pod \"e27477e5-a68e-464c-99bd-63e578881c82\" (UID: \"e27477e5-a68e-464c-99bd-63e578881c82\") " Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.228891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.229985 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts" (OuterVolumeSpecName: "scripts") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.230117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.230206 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run" (OuterVolumeSpecName: "var-run") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.232979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.235746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc" (OuterVolumeSpecName: "kube-api-access-75fzc") pod "e27477e5-a68e-464c-99bd-63e578881c82" (UID: "e27477e5-a68e-464c-99bd-63e578881c82"). InnerVolumeSpecName "kube-api-access-75fzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328418 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328458 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328471 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328503 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75fzc\" (UniqueName: \"kubernetes.io/projected/e27477e5-a68e-464c-99bd-63e578881c82-kube-api-access-75fzc\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328518 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e27477e5-a68e-464c-99bd-63e578881c82-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.328529 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e27477e5-a68e-464c-99bd-63e578881c82-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.624972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q54nb-config-7j6hw" event={"ID":"e27477e5-a68e-464c-99bd-63e578881c82","Type":"ContainerDied","Data":"1440d159581e4b7c210ac95e3d2b823cd2cc28b3cf6820299704055cec28b6a5"} Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.625802 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1440d159581e4b7c210ac95e3d2b823cd2cc28b3cf6820299704055cec28b6a5" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 
16:29:58.625832 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q54nb-config-7j6hw" Oct 06 16:29:58 crc kubenswrapper[4763]: I1006 16:29:58.650926 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-84d8c674c4-6d5xr"] Oct 06 16:29:58 crc kubenswrapper[4763]: W1006 16:29:58.652808 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73edae2f_0842_49e6_b4ed_b05e3b60d53c.slice/crio-f41764ea529c126eb0ed3f1f28305d8b7c4bfd6dac76f161b5cd07cc40185308 WatchSource:0}: Error finding container f41764ea529c126eb0ed3f1f28305d8b7c4bfd6dac76f161b5cd07cc40185308: Status 404 returned error can't find the container with id f41764ea529c126eb0ed3f1f28305d8b7c4bfd6dac76f161b5cd07cc40185308 Oct 06 16:29:59 crc kubenswrapper[4763]: I1006 16:29:59.179351 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q54nb-config-7j6hw"] Oct 06 16:29:59 crc kubenswrapper[4763]: I1006 16:29:59.187875 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q54nb-config-7j6hw"] Oct 06 16:29:59 crc kubenswrapper[4763]: I1006 16:29:59.382381 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-q54nb" Oct 06 16:29:59 crc kubenswrapper[4763]: I1006 16:29:59.601511 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27477e5-a68e-464c-99bd-63e578881c82" path="/var/lib/kubelet/pods/e27477e5-a68e-464c-99bd-63e578881c82/volumes" Oct 06 16:29:59 crc kubenswrapper[4763]: I1006 16:29:59.644333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-84d8c674c4-6d5xr" event={"ID":"73edae2f-0842-49e6-b4ed-b05e3b60d53c","Type":"ContainerStarted","Data":"f41764ea529c126eb0ed3f1f28305d8b7c4bfd6dac76f161b5cd07cc40185308"} Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.144096 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n"] Oct 06 16:30:00 crc kubenswrapper[4763]: E1006 16:30:00.144900 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27477e5-a68e-464c-99bd-63e578881c82" containerName="ovn-config" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.144933 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27477e5-a68e-464c-99bd-63e578881c82" containerName="ovn-config" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.145177 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27477e5-a68e-464c-99bd-63e578881c82" containerName="ovn-config" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.146534 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.149332 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.149569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.156954 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n"] Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.260151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.260250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwj56\" (UniqueName: \"kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.260295 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.361789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwj56\" (UniqueName: \"kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.362186 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.362272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.363162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume\") pod 
\"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.369120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.388486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwj56\" (UniqueName: \"kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56\") pod \"collect-profiles-29329470-h5p9n\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.486271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:00 crc kubenswrapper[4763]: W1006 16:30:00.964931 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45855e5d_ec6c_429b_8792_ddd939f5b4db.slice/crio-4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d WatchSource:0}: Error finding container 4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d: Status 404 returned error can't find the container with id 4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d Oct 06 16:30:00 crc kubenswrapper[4763]: I1006 16:30:00.975390 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n"] Oct 06 16:30:01 crc kubenswrapper[4763]: I1006 16:30:01.671979 4763 generic.go:334] "Generic (PLEG): container finished" podID="45855e5d-ec6c-429b-8792-ddd939f5b4db" containerID="80d4a263df705624b3d550a8e9a69ccc5bfa95b4708485568795b58b7fe04f32" exitCode=0 Oct 06 16:30:01 crc kubenswrapper[4763]: I1006 16:30:01.672368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" event={"ID":"45855e5d-ec6c-429b-8792-ddd939f5b4db","Type":"ContainerDied","Data":"80d4a263df705624b3d550a8e9a69ccc5bfa95b4708485568795b58b7fe04f32"} Oct 06 16:30:01 crc kubenswrapper[4763]: I1006 16:30:01.672402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" event={"ID":"45855e5d-ec6c-429b-8792-ddd939f5b4db","Type":"ContainerStarted","Data":"4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d"} Oct 06 16:30:03 crc kubenswrapper[4763]: I1006 16:30:03.876824 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:30:03 crc kubenswrapper[4763]: I1006 16:30:03.878382 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:30:03 crc kubenswrapper[4763]: I1006 16:30:03.878438 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:30:03 crc kubenswrapper[4763]: I1006 16:30:03.879951 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:30:03 crc kubenswrapper[4763]: I1006 16:30:03.880023 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272" gracePeriod=600 Oct 06 16:30:04 crc kubenswrapper[4763]: I1006 16:30:04.707940 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272" exitCode=0 Oct 06 16:30:04 crc kubenswrapper[4763]: I1006 16:30:04.707992 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272"} Oct 06 16:30:04 crc kubenswrapper[4763]: I1006 16:30:04.708088 4763 scope.go:117] "RemoveContainer" containerID="f6c7c8330bf0bced2975143c4953c9c7a3f0a99251a225b070b7e5febdebc15b" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.648533 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.719219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" event={"ID":"45855e5d-ec6c-429b-8792-ddd939f5b4db","Type":"ContainerDied","Data":"4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d"} Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.719261 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c1a824fa948b3c4506335ae69962f7c058dc4531462ffd59d6d9361fa1af82d" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.719693 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.782356 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume\") pod \"45855e5d-ec6c-429b-8792-ddd939f5b4db\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.782501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwj56\" (UniqueName: \"kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56\") pod \"45855e5d-ec6c-429b-8792-ddd939f5b4db\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.782550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume\") pod \"45855e5d-ec6c-429b-8792-ddd939f5b4db\" (UID: \"45855e5d-ec6c-429b-8792-ddd939f5b4db\") " Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.783232 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume" (OuterVolumeSpecName: "config-volume") pod "45855e5d-ec6c-429b-8792-ddd939f5b4db" (UID: "45855e5d-ec6c-429b-8792-ddd939f5b4db"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.788267 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45855e5d-ec6c-429b-8792-ddd939f5b4db" (UID: "45855e5d-ec6c-429b-8792-ddd939f5b4db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.788919 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56" (OuterVolumeSpecName: "kube-api-access-wwj56") pod "45855e5d-ec6c-429b-8792-ddd939f5b4db" (UID: "45855e5d-ec6c-429b-8792-ddd939f5b4db"). InnerVolumeSpecName "kube-api-access-wwj56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.884754 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45855e5d-ec6c-429b-8792-ddd939f5b4db-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.884790 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45855e5d-ec6c-429b-8792-ddd939f5b4db-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:05 crc kubenswrapper[4763]: I1006 16:30:05.884801 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwj56\" (UniqueName: \"kubernetes.io/projected/45855e5d-ec6c-429b-8792-ddd939f5b4db-kube-api-access-wwj56\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:06 crc kubenswrapper[4763]: I1006 16:30:06.721390 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl"] Oct 06 16:30:06 crc kubenswrapper[4763]: I1006 16:30:06.731648 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329425-snntl"] Oct 06 16:30:07 crc kubenswrapper[4763]: I1006 16:30:07.586448 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d57bdf3-3c07-45d6-90bd-ccf26b207bf1" path="/var/lib/kubelet/pods/6d57bdf3-3c07-45d6-90bd-ccf26b207bf1/volumes" Oct 06 16:30:08 crc kubenswrapper[4763]: I1006 16:30:08.771966 4763 generic.go:334] "Generic (PLEG): container finished" podID="73edae2f-0842-49e6-b4ed-b05e3b60d53c" containerID="b5f1a67c5d15c71d76029aeb5eebe7d80c6482dc4eaff582f455e323ca601082" exitCode=0 Oct 06 16:30:08 crc kubenswrapper[4763]: I1006 16:30:08.772060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-84d8c674c4-6d5xr" event={"ID":"73edae2f-0842-49e6-b4ed-b05e3b60d53c","Type":"ContainerDied","Data":"b5f1a67c5d15c71d76029aeb5eebe7d80c6482dc4eaff582f455e323ca601082"} Oct 06 16:30:08 crc kubenswrapper[4763]: I1006 16:30:08.776603 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428"} Oct 06 16:30:09 crc kubenswrapper[4763]: I1006 16:30:09.788656 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-84d8c674c4-6d5xr" event={"ID":"73edae2f-0842-49e6-b4ed-b05e3b60d53c","Type":"ContainerStarted","Data":"e7e8e3a1dd15af05a9dbf6a16c59450a120c68de5b2de8bfc09299791dd9b8ea"} Oct 06 16:30:09 crc kubenswrapper[4763]: I1006 16:30:09.789163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-84d8c674c4-6d5xr" event={"ID":"73edae2f-0842-49e6-b4ed-b05e3b60d53c","Type":"ContainerStarted","Data":"40c97e1b3beee695fd2925579e4b74bb422b4f9568c528962b798e7e5e7f412b"} Oct 06 16:30:09 crc kubenswrapper[4763]: I1006 16:30:09.789266 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:30:09 crc kubenswrapper[4763]: I1006 16:30:09.817854 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-84d8c674c4-6d5xr" podStartSLOduration=3.840874903 podStartE2EDuration="12.81782807s" podCreationTimestamp="2025-10-06 16:29:57 +0000 UTC" firstStartedPulling="2025-10-06 
16:29:58.655251414 +0000 UTC m=+5795.810543926" lastFinishedPulling="2025-10-06 16:30:07.632204581 +0000 UTC m=+5804.787497093" observedRunningTime="2025-10-06 16:30:09.81074192 +0000 UTC m=+5806.966034442" watchObservedRunningTime="2025-10-06 16:30:09.81782807 +0000 UTC m=+5806.973120582" Oct 06 16:30:10 crc kubenswrapper[4763]: I1006 16:30:10.798182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.820264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-5tscr"] Oct 06 16:30:17 crc kubenswrapper[4763]: E1006 16:30:17.821347 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45855e5d-ec6c-429b-8792-ddd939f5b4db" containerName="collect-profiles" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.821367 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="45855e5d-ec6c-429b-8792-ddd939f5b4db" containerName="collect-profiles" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.821599 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="45855e5d-ec6c-429b-8792-ddd939f5b4db" containerName="collect-profiles" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.822683 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.826282 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.830644 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-5tscr"] Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.835191 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.835385 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.974208 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data-merged\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.974762 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5c494c30-afad-4e8e-89b7-702befa8ab06-hm-ports\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.975084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:17 crc kubenswrapper[4763]: I1006 16:30:17.975170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-scripts\") pod \"octavia-rsyslog-5tscr\" (UID: 
\"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.076794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.077280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-scripts\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.077505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data-merged\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.077734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5c494c30-afad-4e8e-89b7-702befa8ab06-hm-ports\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.078219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data-merged\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.078807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5c494c30-afad-4e8e-89b7-702befa8ab06-hm-ports\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.085131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-config-data\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.085755 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c494c30-afad-4e8e-89b7-702befa8ab06-scripts\") pod \"octavia-rsyslog-5tscr\" (UID: \"5c494c30-afad-4e8e-89b7-702befa8ab06\") " pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.146385 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.700443 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.702484 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.704751 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.716383 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.791608 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.791677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.802988 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-5tscr"] Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.883845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5tscr" event={"ID":"5c494c30-afad-4e8e-89b7-702befa8ab06","Type":"ContainerStarted","Data":"7689171fb9ddccbc552fbc56add4073ac6cbfbb5918a212f18f8a8a4a8b92b4d"} Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.894312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.894448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.895956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:18 crc kubenswrapper[4763]: I1006 16:30:18.909781 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config\") pod \"octavia-image-upload-59f8cff499-446zz\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:19 crc kubenswrapper[4763]: I1006 16:30:19.050394 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:19 crc kubenswrapper[4763]: I1006 16:30:19.600564 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:30:19 crc kubenswrapper[4763]: I1006 16:30:19.893029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerStarted","Data":"eea63e6b71bd979e938c7b7e8a645513f347c4a53791ae639d24e8b84be5e808"} Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.164248 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-g858t"] Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.170232 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.173638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.179516 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-g858t"] Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.320897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.320968 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.321079 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.321149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.422530 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.422647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 
06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.422732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.422756 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.423145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.428265 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.429280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.429568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data\") pod \"octavia-db-sync-g858t\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.504339 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.910596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5tscr" event={"ID":"5c494c30-afad-4e8e-89b7-702befa8ab06","Type":"ContainerStarted","Data":"e3ffac594cf8630e90c7eb75e34062d69764f90bddbc93981a9c6e535ca460f8"} Oct 06 16:30:20 crc kubenswrapper[4763]: I1006 16:30:20.980137 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-g858t"] Oct 06 16:30:21 crc kubenswrapper[4763]: I1006 16:30:21.927472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerStarted","Data":"98122a257f48f18dc1d6850af06c96b1cabb3743ce127d9366568a6fd0ef3400"} Oct 06 16:30:22 crc kubenswrapper[4763]: I1006 16:30:22.938562 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c494c30-afad-4e8e-89b7-702befa8ab06" containerID="e3ffac594cf8630e90c7eb75e34062d69764f90bddbc93981a9c6e535ca460f8" exitCode=0 Oct 06 16:30:22 crc kubenswrapper[4763]: I1006 16:30:22.938649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5tscr" event={"ID":"5c494c30-afad-4e8e-89b7-702befa8ab06","Type":"ContainerDied","Data":"e3ffac594cf8630e90c7eb75e34062d69764f90bddbc93981a9c6e535ca460f8"} Oct 06 16:30:22 crc kubenswrapper[4763]: I1006 16:30:22.941982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerStarted","Data":"b28a1e8381562c7f40031d57470887f593974afb65c87f52c5fc196a9c48292c"} Oct 06 16:30:23 crc kubenswrapper[4763]: I1006 16:30:23.951488 4763 generic.go:334] "Generic (PLEG): container finished" podID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerID="b28a1e8381562c7f40031d57470887f593974afb65c87f52c5fc196a9c48292c" exitCode=0 Oct 06 16:30:23 crc kubenswrapper[4763]: I1006 16:30:23.951601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerDied","Data":"b28a1e8381562c7f40031d57470887f593974afb65c87f52c5fc196a9c48292c"} Oct 06 16:30:24 crc kubenswrapper[4763]: I1006 16:30:24.964491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerStarted","Data":"04dfb7716644033f1416b3528f59348236e173d8feeff77279d1e50a2565b39d"} Oct 06 16:30:24 crc kubenswrapper[4763]: I1006 16:30:24.991295 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-g858t" podStartSLOduration=4.991276588 podStartE2EDuration="4.991276588s" podCreationTimestamp="2025-10-06 16:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:30:24.989991623 +0000 UTC m=+5822.145284165" watchObservedRunningTime="2025-10-06 16:30:24.991276588 +0000 UTC m=+5822.146569110" Oct 06 16:30:28 crc kubenswrapper[4763]: I1006 16:30:28.026563 4763 generic.go:334] "Generic (PLEG): container finished" podID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerID="04dfb7716644033f1416b3528f59348236e173d8feeff77279d1e50a2565b39d" exitCode=0 Oct 06 16:30:28 crc kubenswrapper[4763]: I1006 16:30:28.026739 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" 
event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerDied","Data":"04dfb7716644033f1416b3528f59348236e173d8feeff77279d1e50a2565b39d"} Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.037521 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-5tscr" event={"ID":"5c494c30-afad-4e8e-89b7-702befa8ab06","Type":"ContainerStarted","Data":"4b552b48c92060b815828ce25baeb0ec103c35213e38e6ccacd3307277a7b2b3"} Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.038043 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.040542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerStarted","Data":"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430"} Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.073978 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-5tscr" podStartSLOduration=2.627755543 podStartE2EDuration="12.073954272s" podCreationTimestamp="2025-10-06 16:30:17 +0000 UTC" firstStartedPulling="2025-10-06 16:30:18.801835055 +0000 UTC m=+5815.957127567" lastFinishedPulling="2025-10-06 16:30:28.248033744 +0000 UTC m=+5825.403326296" observedRunningTime="2025-10-06 16:30:29.067197791 +0000 UTC m=+5826.222490343" watchObservedRunningTime="2025-10-06 16:30:29.073954272 +0000 UTC m=+5826.229246794" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.501327 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.637949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle\") pod \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.638077 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged\") pod \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.638169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts\") pod \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.638284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data\") pod \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\" (UID: \"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0\") " Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.643783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data" (OuterVolumeSpecName: "config-data") pod "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" (UID: "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.648777 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts" (OuterVolumeSpecName: "scripts") pod "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" (UID: "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.669573 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" (UID: "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.670938 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" (UID: "4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.741293 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.741346 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.741359 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:29 crc kubenswrapper[4763]: I1006 16:30:29.741373 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.050039 4763 generic.go:334] "Generic (PLEG): container finished" podID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerID="7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430" exitCode=0 Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.050208 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerDied","Data":"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430"} Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.052663 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-g858t" Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.052654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g858t" event={"ID":"4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0","Type":"ContainerDied","Data":"98122a257f48f18dc1d6850af06c96b1cabb3743ce127d9366568a6fd0ef3400"} Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.052708 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98122a257f48f18dc1d6850af06c96b1cabb3743ce127d9366568a6fd0ef3400" Oct 06 16:30:30 crc kubenswrapper[4763]: I1006 16:30:30.588714 4763 scope.go:117] "RemoveContainer" containerID="79fce7244c5f011bdb4c39676f0c42d9d25cf68d150a30d747df6292b4de1e6e" Oct 06 16:30:32 crc kubenswrapper[4763]: I1006 16:30:32.565402 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:30:32 crc kubenswrapper[4763]: I1006 16:30:32.566027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-84d8c674c4-6d5xr" Oct 06 16:30:33 crc kubenswrapper[4763]: I1006 16:30:33.183589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-5tscr" Oct 06 16:30:35 crc kubenswrapper[4763]: I1006 16:30:35.103035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerStarted","Data":"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c"} Oct 06 16:30:53 crc kubenswrapper[4763]: I1006 16:30:53.032187 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-446zz" podStartSLOduration=20.590761075 podStartE2EDuration="35.032157258s" podCreationTimestamp="2025-10-06 16:30:18 +0000 UTC" firstStartedPulling="2025-10-06 16:30:19.594806261 +0000 UTC m=+5816.750098773" lastFinishedPulling="2025-10-06 16:30:34.036202444 +0000 UTC m=+5831.191494956" observedRunningTime="2025-10-06 16:30:35.138881069 +0000 UTC m=+5832.294173581" watchObservedRunningTime="2025-10-06 16:30:53.032157258 +0000 UTC m=+5850.187449810" Oct 06 16:30:53 crc kubenswrapper[4763]: I1006 16:30:53.042444 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zqmcj"] Oct 06 16:30:53 crc kubenswrapper[4763]: I1006 16:30:53.051385 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zqmcj"] Oct 06 16:30:53 crc kubenswrapper[4763]: I1006 16:30:53.585985 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08632e8e-5a58-413d-a3ba-c9454181600d" path="/var/lib/kubelet/pods/08632e8e-5a58-413d-a3ba-c9454181600d/volumes" Oct 06 16:30:58 crc kubenswrapper[4763]: I1006 16:30:58.835254 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:30:58 crc kubenswrapper[4763]: I1006 16:30:58.836010 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-446zz" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="octavia-amphora-httpd" containerID="cri-o://90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c" gracePeriod=30 Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.344189 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.391451 4763 generic.go:334] "Generic (PLEG): container finished" podID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerID="90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c" exitCode=0 Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.391516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerDied","Data":"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c"} Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.391548 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-446zz" event={"ID":"7337ad49-05bc-4a27-9ffa-12e032bd5e58","Type":"ContainerDied","Data":"eea63e6b71bd979e938c7b7e8a645513f347c4a53791ae639d24e8b84be5e808"} Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.391549 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-446zz" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.391573 4763 scope.go:117] "RemoveContainer" containerID="90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.425249 4763 scope.go:117] "RemoveContainer" containerID="7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.451215 4763 scope.go:117] "RemoveContainer" containerID="90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c" Oct 06 16:30:59 crc kubenswrapper[4763]: E1006 16:30:59.453937 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c\": container with ID starting with 90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c not found: ID does not exist" containerID="90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.454007 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c"} err="failed to get container status \"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c\": rpc error: code = NotFound desc = could not find container \"90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c\": container with ID starting with 90b6ccf9fdfb1559a1e3e3902686a8c7fcb4b0129a62db7bc0045313dc454c7c not found: ID does not exist" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.454032 4763 scope.go:117] "RemoveContainer" containerID="7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430" Oct 06 16:30:59 crc kubenswrapper[4763]: E1006 16:30:59.454340 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430\": container with ID starting with 7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430 not found: ID does not exist" containerID="7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.454430 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430"} err="failed to get container status \"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430\": rpc error: code = NotFound desc = could not find container \"7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430\": container with ID starting with 7242a4e22a1b77e036d4dfc0d483500dfc2e595f9235d31d73e0ab243cf6e430 not found: ID does not exist" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.470721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image\") pod \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.471113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config\") pod \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\" (UID: \"7337ad49-05bc-4a27-9ffa-12e032bd5e58\") " Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.509902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7337ad49-05bc-4a27-9ffa-12e032bd5e58" (UID: "7337ad49-05bc-4a27-9ffa-12e032bd5e58"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.571628 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "7337ad49-05bc-4a27-9ffa-12e032bd5e58" (UID: "7337ad49-05bc-4a27-9ffa-12e032bd5e58"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.573461 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7337ad49-05bc-4a27-9ffa-12e032bd5e58-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.573506 4763 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/7337ad49-05bc-4a27-9ffa-12e032bd5e58-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.718211 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:30:59 crc kubenswrapper[4763]: I1006 16:30:59.728107 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-446zz"] Oct 06 16:31:01 crc kubenswrapper[4763]: I1006 16:31:01.593569 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" path="/var/lib/kubelet/pods/7337ad49-05bc-4a27-9ffa-12e032bd5e58/volumes" Oct 06 16:31:03 crc kubenswrapper[4763]: I1006 16:31:03.030400 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6537-account-create-f2w6n"] Oct 06 16:31:03 crc kubenswrapper[4763]: I1006 16:31:03.042072 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6537-account-create-f2w6n"] Oct 06 16:31:03 crc kubenswrapper[4763]: I1006 16:31:03.589754 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fab8df-dad8-437d-98cb-423007ec4737" path="/var/lib/kubelet/pods/85fab8df-dad8-437d-98cb-423007ec4737/volumes" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.375407 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-s7xd5"] Oct 06 16:31:05 crc kubenswrapper[4763]: E1006 16:31:05.376135 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="init" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.376150 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="init" Oct 06 16:31:05 crc kubenswrapper[4763]: E1006 16:31:05.376165 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerName="init" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.376171 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerName="init" Oct 06 16:31:05 crc kubenswrapper[4763]: E1006 16:31:05.376192 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="octavia-amphora-httpd" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.376199 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="octavia-amphora-httpd" Oct 06 16:31:05 crc kubenswrapper[4763]: E1006 16:31:05.376219 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerName="octavia-db-sync" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.376225 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerName="octavia-db-sync" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 
16:31:05.376393 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7337ad49-05bc-4a27-9ffa-12e032bd5e58" containerName="octavia-amphora-httpd" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.376414 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" containerName="octavia-db-sync" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.377573 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.380322 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.390336 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-s7xd5"] Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.497717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2686aaa-9fa1-43dd-a3af-536f1e462f28-httpd-config\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.497883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b2686aaa-9fa1-43dd-a3af-536f1e462f28-amphora-image\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.599654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2686aaa-9fa1-43dd-a3af-536f1e462f28-httpd-config\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.599773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b2686aaa-9fa1-43dd-a3af-536f1e462f28-amphora-image\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.600253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b2686aaa-9fa1-43dd-a3af-536f1e462f28-amphora-image\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.608142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2686aaa-9fa1-43dd-a3af-536f1e462f28-httpd-config\") pod \"octavia-image-upload-59f8cff499-s7xd5\" (UID: \"b2686aaa-9fa1-43dd-a3af-536f1e462f28\") " pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:05 crc kubenswrapper[4763]: I1006 16:31:05.706639 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" Oct 06 16:31:06 crc kubenswrapper[4763]: I1006 16:31:06.185827 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-s7xd5"] Oct 06 16:31:06 crc kubenswrapper[4763]: I1006 16:31:06.475918 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" event={"ID":"b2686aaa-9fa1-43dd-a3af-536f1e462f28","Type":"ContainerStarted","Data":"f40b6fde94469eb614c2fe80e86f25e197e22e7a7debba65c9707f96cab09e22"} Oct 06 16:31:08 crc kubenswrapper[4763]: I1006 16:31:08.499758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" event={"ID":"b2686aaa-9fa1-43dd-a3af-536f1e462f28","Type":"ContainerStarted","Data":"006d5595f8926191608f20729f749c93dfef0985122ba12d3e818ea377bd5e01"} Oct 06 16:31:10 crc kubenswrapper[4763]: I1006 16:31:10.040779 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ng4zk"] Oct 06 16:31:10 crc kubenswrapper[4763]: I1006 16:31:10.050680 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ng4zk"] Oct 06 16:31:11 crc kubenswrapper[4763]: I1006 16:31:11.529888 4763 generic.go:334] "Generic (PLEG): container finished" podID="b2686aaa-9fa1-43dd-a3af-536f1e462f28" containerID="006d5595f8926191608f20729f749c93dfef0985122ba12d3e818ea377bd5e01" exitCode=0 Oct 06 16:31:11 crc kubenswrapper[4763]: I1006 16:31:11.529932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" event={"ID":"b2686aaa-9fa1-43dd-a3af-536f1e462f28","Type":"ContainerDied","Data":"006d5595f8926191608f20729f749c93dfef0985122ba12d3e818ea377bd5e01"} Oct 06 16:31:11 crc kubenswrapper[4763]: I1006 16:31:11.586810 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c8a64f-69a3-41ef-9707-70aeede40d32" path="/var/lib/kubelet/pods/57c8a64f-69a3-41ef-9707-70aeede40d32/volumes" Oct 06 16:31:13 crc kubenswrapper[4763]: I1006 16:31:13.556084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" event={"ID":"b2686aaa-9fa1-43dd-a3af-536f1e462f28","Type":"ContainerStarted","Data":"783c32fb9fc2998e76420809e15b14d2acde132aba2ba88cb5b667f9733c0fbe"} Oct 06 16:31:13 crc kubenswrapper[4763]: I1006 16:31:13.581260 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-s7xd5" podStartSLOduration=2.234390973 podStartE2EDuration="8.581240165s" podCreationTimestamp="2025-10-06 16:31:05 +0000 UTC" firstStartedPulling="2025-10-06 16:31:06.192728804 +0000 UTC m=+5863.348021316" lastFinishedPulling="2025-10-06 16:31:12.539577986 +0000 UTC m=+5869.694870508" observedRunningTime="2025-10-06 16:31:13.573601747 +0000 UTC m=+5870.728894299" watchObservedRunningTime="2025-10-06 16:31:13.581240165 +0000 UTC m=+5870.736532677"
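The pod_startup_latency_tracker entry above is worth unpacking: podStartE2EDuration is the wall-clock time from podCreationTimestamp to watchObservedRunningTime, while podStartSLOduration is that same interval with the image-pull window (firstStartedPulling through lastFinishedPulling) subtracted. The figures in the entry check out to within rounding; a minimal Python sketch, with the timestamps copied from the entry and nanoseconds truncated to microseconds:

    from datetime import datetime, timezone

    UTC = timezone.utc
    # Timestamps from the "Observed pod startup duration" entry for
    # openstack/octavia-image-upload-59f8cff499-s7xd5 (ns truncated to us).
    created       = datetime(2025, 10, 6, 16, 31, 5, 0, tzinfo=UTC)        # podCreationTimestamp
    first_pulling = datetime(2025, 10, 6, 16, 31, 6, 192728, tzinfo=UTC)   # firstStartedPulling
    last_pulled   = datetime(2025, 10, 6, 16, 31, 12, 539577, tzinfo=UTC)  # lastFinishedPulling
    running       = datetime(2025, 10, 6, 16, 31, 13, 581240, tzinfo=UTC)  # watchObservedRunningTime

    e2e = (running - created).total_seconds()                  # 8.581240 ~ podStartE2EDuration
    slo = e2e - (last_pulled - first_pulling).total_seconds()  # 2.234391 ~ podStartSLOduration
    print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")

The octavia-healthmanager entry further down shows the degenerate case: both pulling timestamps are the zero value (0001-01-01), meaning no image pull was needed, so podStartSLOduration equals podStartE2EDuration exactly.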
Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.121950 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-rp9zr"] Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.138718 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-rp9zr"] Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.138820 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.148507 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.149250 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.149452 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.216652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-scripts\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.216744 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-amphora-certs\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.216826 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-combined-ca-bundle\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.216867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.216916 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data-merged\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.217008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-hm-ports\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318207 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-scripts\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-amphora-certs\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-combined-ca-bundle\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318363 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data-merged\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.318455 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-hm-ports\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.319482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data-merged\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.319787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-hm-ports\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.324209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-amphora-certs\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.324999 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-config-data\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.332251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-combined-ca-bundle\") pod \"octavia-healthmanager-rp9zr\" (UID: 
\"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.333590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0287ddec-90d4-421c-b1fc-46d9f52f3b0d-scripts\") pod \"octavia-healthmanager-rp9zr\" (UID: \"0287ddec-90d4-421c-b1fc-46d9f52f3b0d\") " pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:25 crc kubenswrapper[4763]: I1006 16:31:25.458292 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:26 crc kubenswrapper[4763]: I1006 16:31:26.025308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-rp9zr"] Oct 06 16:31:26 crc kubenswrapper[4763]: I1006 16:31:26.689132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-rp9zr" event={"ID":"0287ddec-90d4-421c-b1fc-46d9f52f3b0d","Type":"ContainerStarted","Data":"1154918fe584eb26be51a60439823236887136bfc2413646a0f38ccc408e99d6"} Oct 06 16:31:26 crc kubenswrapper[4763]: I1006 16:31:26.689476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-rp9zr" event={"ID":"0287ddec-90d4-421c-b1fc-46d9f52f3b0d","Type":"ContainerStarted","Data":"0191f8be87818f86bd637bf3cf319b94eb35e06baf6eeda46a082a95d07ea26a"} Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.268819 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-qldvm"] Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.272113 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.274480 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.275256 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.298762 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qldvm"] Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.370473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-hm-ports\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.370524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-amphora-certs\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.370677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-combined-ca-bundle\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.370804 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.371037 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-scripts\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.371168 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data-merged\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.472993 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-hm-ports\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473077 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-amphora-certs\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-combined-ca-bundle\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-scripts\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data-merged\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.473895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data-merged\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.474122 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-hm-ports\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.478839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-config-data\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.479189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-combined-ca-bundle\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.496319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-amphora-certs\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.497084 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9-scripts\") pod \"octavia-housekeeping-qldvm\" (UID: \"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9\") " pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.593731 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 16:31:27 crc kubenswrapper[4763]: I1006 16:31:27.593731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:28 crc kubenswrapper[4763]: I1006 16:31:28.190683 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qldvm"] Oct 06 16:31:28 crc kubenswrapper[4763]: I1006 16:31:28.722463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qldvm" event={"ID":"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9","Type":"ContainerStarted","Data":"137aef6a05161ef91ff56637b86a5cca67e4255cc9f7a496fe8c19a4619a35e9"} Oct 06 16:31:28 crc kubenswrapper[4763]: I1006 16:31:28.724302 4763 generic.go:334] "Generic (PLEG): container finished" podID="0287ddec-90d4-421c-b1fc-46d9f52f3b0d" containerID="1154918fe584eb26be51a60439823236887136bfc2413646a0f38ccc408e99d6" exitCode=0 Oct 06 16:31:28 crc kubenswrapper[4763]: I1006 16:31:28.724329 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-rp9zr" event={"ID":"0287ddec-90d4-421c-b1fc-46d9f52f3b0d","Type":"ContainerDied","Data":"1154918fe584eb26be51a60439823236887136bfc2413646a0f38ccc408e99d6"} Oct 06 16:31:29 crc kubenswrapper[4763]: I1006 16:31:29.740591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-rp9zr" event={"ID":"0287ddec-90d4-421c-b1fc-46d9f52f3b0d","Type":"ContainerStarted","Data":"5aa11f954e355048bd87629a1224de318d4bc9a9b2b1d52d15e6d3613dc469c0"} Oct 06 16:31:29 crc kubenswrapper[4763]: I1006 16:31:29.742755 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:29 crc kubenswrapper[4763]: I1006 16:31:29.763216 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-rp9zr" podStartSLOduration=4.763189299 podStartE2EDuration="4.763189299s" podCreationTimestamp="2025-10-06 16:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:31:29.759810647 +0000 UTC m=+5886.915103229" watchObservedRunningTime="2025-10-06 16:31:29.763189299 +0000 UTC m=+5886.918481851" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.706773 4763 scope.go:117] "RemoveContainer" containerID="b947f9d183450207b2e2827dd995f9c8a3e5abe709b10a68a2ff9d80b32661e6" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.753330 4763 scope.go:117] "RemoveContainer" containerID="a133221f5356490e85eccdbf7b8fa1e87c7929a9574b1f6e919ee4a000439dcc" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.756278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qldvm" event={"ID":"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9","Type":"ContainerStarted","Data":"849a2f80a9b45a8e2e7b190a245c394ecee2388665a7f650708f2bf1916b2932"} Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.796963 4763 scope.go:117] "RemoveContainer" containerID="ca536d3678c9897d02e0297702fce2291ee82ac35c6ebace1c9bdb771901fb3d"
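The healthmanager and housekeeping pods above show the init-container rhythm: a first ContainerStarted, a Generic (PLEG) "container finished" with exitCode=0, a ContainerDied for the same container ID, then the main ContainerStarted. The event={...} payload in the PLEG entries is plain JSON in this journal, so it can be lifted straight out; a small sketch under that format assumption:

    import json
    import re

    EVENT = re.compile(r'event=(\{.*?\})')

    def pleg_events(journal_lines):
        # Yield the {"ID": ..., "Type": ..., "Data": ...} payloads from
        # "SyncLoop (PLEG): event for pod" entries; ID is the pod UID and
        # Data the container (or sandbox) ID.
        for line in journal_lines:
            if "SyncLoop (PLEG)" in line and (m := EVENT.search(line)):
                yield json.loads(m.group(1))

A ContainerDied with exitCode=0 followed by another ContainerStarted for the same pod UID is the init container handing off to the main container; a nonzero exit code at this point would instead be followed by a restart and backoff.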
Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.825103 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-5kqgr"] Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.827098 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.836999 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5kqgr"] Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.843177 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.843315 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data-merged\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-amphora-certs\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-scripts\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/87279dd8-575d-4c81-a248-ae62c52f8c24-hm-ports\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945574 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.945636 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-combined-ca-bundle\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:30 crc kubenswrapper[4763]: I1006 16:31:30.969005 4763 scope.go:117] "RemoveContainer" containerID="d7d8bcf1a880e9e944d29c9ecca87acb33ee02388f87ffefec835e182dfbd073" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.006384 4763 scope.go:117] "RemoveContainer" containerID="179a291999c36f551433be402096687932bbb292cc0bfb73cd3377bec7f81ef1" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/87279dd8-575d-4c81-a248-ae62c52f8c24-hm-ports\") pod \"octavia-worker-5kqgr\" (UID: 
\"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047651 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-combined-ca-bundle\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047680 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data-merged\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047704 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-amphora-certs\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.047744 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-scripts\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.051998 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data-merged\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.054694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-scripts\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.055264 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-combined-ca-bundle\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.055438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-config-data\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.056253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/87279dd8-575d-4c81-a248-ae62c52f8c24-amphora-certs\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.066068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/87279dd8-575d-4c81-a248-ae62c52f8c24-hm-ports\") pod \"octavia-worker-5kqgr\" (UID: \"87279dd8-575d-4c81-a248-ae62c52f8c24\") " pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.254119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.769212 4763 generic.go:334] "Generic (PLEG): container finished" podID="2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9" containerID="849a2f80a9b45a8e2e7b190a245c394ecee2388665a7f650708f2bf1916b2932" exitCode=0 Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.769432 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qldvm" event={"ID":"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9","Type":"ContainerDied","Data":"849a2f80a9b45a8e2e7b190a245c394ecee2388665a7f650708f2bf1916b2932"} Oct 06 16:31:31 crc kubenswrapper[4763]: I1006 16:31:31.805661 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5kqgr"] Oct 06 16:31:32 crc kubenswrapper[4763]: I1006 16:31:32.784243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5kqgr" event={"ID":"87279dd8-575d-4c81-a248-ae62c52f8c24","Type":"ContainerStarted","Data":"24d4218d4996f12f8cd7e13bb6930587db5ecda28180e35d3b6abaddba249c10"} Oct 06 16:31:32 crc kubenswrapper[4763]: I1006 16:31:32.788175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qldvm" event={"ID":"2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9","Type":"ContainerStarted","Data":"63dca3e68f35a28e61a101a60ed2ca7a5080d2eb501007627b914630eaad0a79"} Oct 06 16:31:32 crc kubenswrapper[4763]: I1006 16:31:32.789562 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:32 crc kubenswrapper[4763]: I1006 16:31:32.811390 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-qldvm" podStartSLOduration=4.437131508 podStartE2EDuration="5.811365119s" podCreationTimestamp="2025-10-06 16:31:27 +0000 UTC" firstStartedPulling="2025-10-06 16:31:28.198459875 +0000 UTC m=+5885.353752397" lastFinishedPulling="2025-10-06 16:31:29.572693496 +0000 UTC m=+5886.727986008" observedRunningTime="2025-10-06 16:31:32.810871705 +0000 UTC m=+5889.966164257" watchObservedRunningTime="2025-10-06 16:31:32.811365119 +0000 UTC m=+5889.966657651" Oct 06 16:31:34 crc kubenswrapper[4763]: I1006 16:31:34.805497 4763 generic.go:334] "Generic (PLEG): container finished" podID="87279dd8-575d-4c81-a248-ae62c52f8c24" containerID="9990765a4048bcd64a516c11d34c77c519774961b532b73ca08efae5bc2f08d7" exitCode=0 Oct 06 16:31:34 crc kubenswrapper[4763]: I1006 16:31:34.805577 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5kqgr" event={"ID":"87279dd8-575d-4c81-a248-ae62c52f8c24","Type":"ContainerDied","Data":"9990765a4048bcd64a516c11d34c77c519774961b532b73ca08efae5bc2f08d7"} Oct 06 16:31:35 crc kubenswrapper[4763]: I1006 16:31:35.818862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-worker-5kqgr" event={"ID":"87279dd8-575d-4c81-a248-ae62c52f8c24","Type":"ContainerStarted","Data":"aa43166325b16b9c9a88ac27ceb98b4b31baa217cfd042a27eddff78fd094e28"} Oct 06 16:31:35 crc kubenswrapper[4763]: I1006 16:31:35.819349 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:35 crc kubenswrapper[4763]: I1006 16:31:35.837250 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-5kqgr" podStartSLOduration=4.290699182 podStartE2EDuration="5.837227782s" podCreationTimestamp="2025-10-06 16:31:30 +0000 UTC" firstStartedPulling="2025-10-06 16:31:31.81242743 +0000 UTC m=+5888.967719952" lastFinishedPulling="2025-10-06 16:31:33.35895604 +0000 UTC m=+5890.514248552" observedRunningTime="2025-10-06 16:31:35.835520566 +0000 UTC m=+5892.990813078" watchObservedRunningTime="2025-10-06 16:31:35.837227782 +0000 UTC m=+5892.992520294" Oct 06 16:31:40 crc kubenswrapper[4763]: I1006 16:31:40.046957 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fl2l9"] Oct 06 16:31:40 crc kubenswrapper[4763]: I1006 16:31:40.055607 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fl2l9"] Oct 06 16:31:40 crc kubenswrapper[4763]: I1006 16:31:40.491214 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-rp9zr" Oct 06 16:31:41 crc kubenswrapper[4763]: I1006 16:31:41.590063 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd968ee-8b72-4352-b0d9-0ef485399021" path="/var/lib/kubelet/pods/3cd968ee-8b72-4352-b0d9-0ef485399021/volumes" Oct 06 16:31:42 crc kubenswrapper[4763]: I1006 16:31:42.626316 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-qldvm" Oct 06 16:31:46 crc kubenswrapper[4763]: I1006 16:31:46.286534 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-5kqgr" Oct 06 16:31:50 crc kubenswrapper[4763]: I1006 16:31:50.043777 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a665-account-create-5jwsj"] Oct 06 16:31:50 crc kubenswrapper[4763]: I1006 16:31:50.054285 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a665-account-create-5jwsj"] Oct 06 16:31:51 crc kubenswrapper[4763]: I1006 16:31:51.593201 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ece696e-b1c4-41e9-8067-7f86114e4cbe" path="/var/lib/kubelet/pods/5ece696e-b1c4-41e9-8067-7f86114e4cbe/volumes" Oct 06 16:31:58 crc kubenswrapper[4763]: I1006 16:31:58.028391 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kf4p9"] Oct 06 16:31:58 crc kubenswrapper[4763]: I1006 16:31:58.041645 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kf4p9"] Oct 06 16:31:59 crc kubenswrapper[4763]: I1006 16:31:59.585011 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a71e1f5-866b-4034-808f-878e473fbbc3" path="/var/lib/kubelet/pods/4a71e1f5-866b-4034-808f-878e473fbbc3/volumes" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.653656 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.656266 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.653656 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.656266 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.669185 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.766428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbhc\" (UniqueName: \"kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.766588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.766681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.868410 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.868978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.869127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.869221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.869381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbhc\" (UniqueName: \"kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.907740 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2wbhc\" (UniqueName: \"kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc\") pod \"community-operators-42rkg\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:03 crc kubenswrapper[4763]: I1006 16:32:03.992835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:04 crc kubenswrapper[4763]: I1006 16:32:04.664734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:05 crc kubenswrapper[4763]: I1006 16:32:05.135808 4763 generic.go:334] "Generic (PLEG): container finished" podID="686924f9-74b5-48e9-b757-96993e31a0e6" containerID="f633fe1a1b5cfd0ae2caa5ddfba5daab1cfaa53d556fb1e9ff05a430fee73ddc" exitCode=0 Oct 06 16:32:05 crc kubenswrapper[4763]: I1006 16:32:05.136352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerDied","Data":"f633fe1a1b5cfd0ae2caa5ddfba5daab1cfaa53d556fb1e9ff05a430fee73ddc"} Oct 06 16:32:05 crc kubenswrapper[4763]: I1006 16:32:05.136386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerStarted","Data":"6f573bc3f82a3c4fa006f4c7c0738f95c9463bf2ab5a5460bbda9e7d1219c7c3"} Oct 06 16:32:06 crc kubenswrapper[4763]: I1006 16:32:06.146768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerStarted","Data":"d601e716b73422cf1f131f6f4cdc573079d2e7334f20dd48a755d476abd9740e"} Oct 06 16:32:07 crc kubenswrapper[4763]: I1006 16:32:07.158046 4763 generic.go:334] "Generic (PLEG): container finished" podID="686924f9-74b5-48e9-b757-96993e31a0e6" containerID="d601e716b73422cf1f131f6f4cdc573079d2e7334f20dd48a755d476abd9740e" exitCode=0 Oct 06 16:32:07 crc kubenswrapper[4763]: I1006 16:32:07.158171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerDied","Data":"d601e716b73422cf1f131f6f4cdc573079d2e7334f20dd48a755d476abd9740e"} Oct 06 16:32:08 crc kubenswrapper[4763]: I1006 16:32:08.169797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerStarted","Data":"4c0c52571bf03e0d2b324abf4fbfd0fd3c99c413356e9439a79bd0e0902b1a8f"} Oct 06 16:32:08 crc kubenswrapper[4763]: I1006 16:32:08.188350 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42rkg" podStartSLOduration=2.726756633 podStartE2EDuration="5.188332352s" podCreationTimestamp="2025-10-06 16:32:03 +0000 UTC" firstStartedPulling="2025-10-06 16:32:05.140022669 +0000 UTC m=+5922.295315191" lastFinishedPulling="2025-10-06 16:32:07.601598398 +0000 UTC m=+5924.756890910" observedRunningTime="2025-10-06 16:32:08.188003343 +0000 UTC m=+5925.343295855" watchObservedRunningTime="2025-10-06 16:32:08.188332352 +0000 UTC m=+5925.343624864" Oct 06 16:32:13 crc kubenswrapper[4763]: I1006 16:32:13.993138 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:13 crc kubenswrapper[4763]: I1006 16:32:13.995369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:14 crc kubenswrapper[4763]: I1006 16:32:14.072186 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:14 crc kubenswrapper[4763]: I1006 16:32:14.314664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.048797 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.049766 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42rkg" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="registry-server" containerID="cri-o://4c0c52571bf03e0d2b324abf4fbfd0fd3c99c413356e9439a79bd0e0902b1a8f" gracePeriod=2 Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.268265 4763 generic.go:334] "Generic (PLEG): container finished" podID="686924f9-74b5-48e9-b757-96993e31a0e6" containerID="4c0c52571bf03e0d2b324abf4fbfd0fd3c99c413356e9439a79bd0e0902b1a8f" exitCode=0 Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.268316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerDied","Data":"4c0c52571bf03e0d2b324abf4fbfd0fd3c99c413356e9439a79bd0e0902b1a8f"} Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.577341 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.661528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wbhc\" (UniqueName: \"kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc\") pod \"686924f9-74b5-48e9-b757-96993e31a0e6\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.661741 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content\") pod \"686924f9-74b5-48e9-b757-96993e31a0e6\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.661773 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities\") pod \"686924f9-74b5-48e9-b757-96993e31a0e6\" (UID: \"686924f9-74b5-48e9-b757-96993e31a0e6\") " Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.663272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities" (OuterVolumeSpecName: "utilities") pod "686924f9-74b5-48e9-b757-96993e31a0e6" (UID: "686924f9-74b5-48e9-b757-96993e31a0e6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.668652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc" (OuterVolumeSpecName: "kube-api-access-2wbhc") pod "686924f9-74b5-48e9-b757-96993e31a0e6" (UID: "686924f9-74b5-48e9-b757-96993e31a0e6"). InnerVolumeSpecName "kube-api-access-2wbhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.706634 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "686924f9-74b5-48e9-b757-96993e31a0e6" (UID: "686924f9-74b5-48e9-b757-96993e31a0e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.764398 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wbhc\" (UniqueName: \"kubernetes.io/projected/686924f9-74b5-48e9-b757-96993e31a0e6-kube-api-access-2wbhc\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.764430 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:17 crc kubenswrapper[4763]: I1006 16:32:17.764439 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686924f9-74b5-48e9-b757-96993e31a0e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.284360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42rkg" event={"ID":"686924f9-74b5-48e9-b757-96993e31a0e6","Type":"ContainerDied","Data":"6f573bc3f82a3c4fa006f4c7c0738f95c9463bf2ab5a5460bbda9e7d1219c7c3"} Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.284480 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-42rkg" Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.286441 4763 scope.go:117] "RemoveContainer" containerID="4c0c52571bf03e0d2b324abf4fbfd0fd3c99c413356e9439a79bd0e0902b1a8f" Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.317395 4763 scope.go:117] "RemoveContainer" containerID="d601e716b73422cf1f131f6f4cdc573079d2e7334f20dd48a755d476abd9740e" Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.348950 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.356489 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42rkg"] Oct 06 16:32:18 crc kubenswrapper[4763]: I1006 16:32:18.360345 4763 scope.go:117] "RemoveContainer" containerID="f633fe1a1b5cfd0ae2caa5ddfba5daab1cfaa53d556fb1e9ff05a430fee73ddc" Oct 06 16:32:19 crc kubenswrapper[4763]: I1006 16:32:19.587963 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" path="/var/lib/kubelet/pods/686924f9-74b5-48e9-b757-96993e31a0e6/volumes" Oct 06 16:32:31 crc kubenswrapper[4763]: I1006 16:32:31.149041 4763 scope.go:117] "RemoveContainer" containerID="243d4b03a3e8dd986745b8e2717056948bae1c9aa97fbe23e30d58d41c8eed2d" Oct 06 16:32:31 crc kubenswrapper[4763]: I1006 16:32:31.202570 4763 scope.go:117] "RemoveContainer" containerID="986a79dc1cdfd45a05e6879a8f8f967ea7acf4a6abe1a1f33b258cf415cbf8d3" Oct 06 16:32:31 crc kubenswrapper[4763]: I1006 16:32:31.242957 4763 scope.go:117] "RemoveContainer" containerID="462c3d4b549efd1e59fb3371977791ca076e53388f28fca8b2950c6e916d5077" Oct 06 16:32:33 crc kubenswrapper[4763]: I1006 16:32:33.876389 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:32:33 crc kubenswrapper[4763]: I1006 16:32:33.876899 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.468906 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:32:37 crc kubenswrapper[4763]: E1006 16:32:37.469718 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="extract-content" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.469731 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="extract-content" Oct 06 16:32:37 crc kubenswrapper[4763]: E1006 16:32:37.469745 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="extract-utilities" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.469752 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="extract-utilities" Oct 06 16:32:37 crc kubenswrapper[4763]: E1006 16:32:37.469767 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="registry-server" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.469773 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="registry-server" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.470093 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="686924f9-74b5-48e9-b757-96993e31a0e6" containerName="registry-server" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.471201 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.474252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.474589 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.474773 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.474910 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-qkrdz" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.483466 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.534784 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.535009 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-log" containerID="cri-o://bd0a0b7bdfe3d5fe79478b17b35b7be0828ceec6f7ae720249005a4222a01523" gracePeriod=30 Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.535411 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-httpd" containerID="cri-o://406287c65797cd2ff0595db4b85e18a58994679e45b67a790eaa03a26cc43710" gracePeriod=30 Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.583850 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.583926 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.583972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" 
Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.584012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.584041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9tx\" (UniqueName: \"kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.595275 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.596902 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.596985 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.639828 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.640251 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-log" containerID="cri-o://fb17470f297a8aaaa2af829c7843e1eff33b230a533c5629817531f801d5eb13" gracePeriod=30 Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.640390 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-httpd" containerID="cri-o://170a8f5ca5e2bdeb9ac21f65a289f63bf5745d0c14eb40adebf600d5d8afcca3" gracePeriod=30 Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686213 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686334 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmwl\" (UniqueName: \"kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v9tx\" (UniqueName: \"kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.686432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.687170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.687675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.688083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data\") pod \"horizon-79d47c4685-dmtl6\" (UID: 
\"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.691872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.705062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9tx\" (UniqueName: \"kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx\") pod \"horizon-79d47c4685-dmtl6\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.789580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmwl\" (UniqueName: \"kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.789664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.789760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.789936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.789988 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.791558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.791682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.792106 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.792325 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.803165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.809315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmwl\" (UniqueName: \"kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl\") pod \"horizon-6fbd8b6749-rs2fw\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:37 crc kubenswrapper[4763]: I1006 16:32:37.939066 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.126381 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.169975 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.174889 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.207323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.234952 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.236546 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.316448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.316533 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.316738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.316802 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.316900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qc4\" (UniqueName: \"kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.418777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.419124 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.419188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: 
\"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.419214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.419278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qc4\" (UniqueName: \"kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.419851 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.420661 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.422564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.425853 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.435056 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qc4\" (UniqueName: \"kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4\") pod \"horizon-c9cf99f6f-v9tb6\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.520329 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.540556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerStarted","Data":"9b3d6f4364d6c474b7a41bc2504ebbac3fd73d58830d63f960de0e7f71d9c4cc"} Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.544128 4763 generic.go:334] "Generic (PLEG): container finished" podID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerID="fb17470f297a8aaaa2af829c7843e1eff33b230a533c5629817531f801d5eb13" exitCode=143 Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.544243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerDied","Data":"fb17470f297a8aaaa2af829c7843e1eff33b230a533c5629817531f801d5eb13"} Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.546462 4763 generic.go:334] "Generic (PLEG): container finished" podID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerID="bd0a0b7bdfe3d5fe79478b17b35b7be0828ceec6f7ae720249005a4222a01523" exitCode=143 Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.546500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerDied","Data":"bd0a0b7bdfe3d5fe79478b17b35b7be0828ceec6f7ae720249005a4222a01523"} Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.566391 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:32:38 crc kubenswrapper[4763]: W1006 16:32:38.568850 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ff90281_f3f7_40a1_87ce_50af05900b74.slice/crio-bd91d897ceb978dcb3fb49694c8ae053b51344b063ca4d4ce44aa989b675737b WatchSource:0}: Error finding container bd91d897ceb978dcb3fb49694c8ae053b51344b063ca4d4ce44aa989b675737b: Status 404 returned error can't find the container with id bd91d897ceb978dcb3fb49694c8ae053b51344b063ca4d4ce44aa989b675737b Oct 06 16:32:38 crc kubenswrapper[4763]: I1006 16:32:38.969925 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:32:38 crc kubenswrapper[4763]: W1006 16:32:38.981653 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded85b17e_a95b_48e1_9cb3_afec958e3ecd.slice/crio-7bb1cb7a0e1459c528a2417315f2e26d3223b22ae5c9392b6bdeb6992701bb18 WatchSource:0}: Error finding container 7bb1cb7a0e1459c528a2417315f2e26d3223b22ae5c9392b6bdeb6992701bb18: Status 404 returned error can't find the container with id 7bb1cb7a0e1459c528a2417315f2e26d3223b22ae5c9392b6bdeb6992701bb18 Oct 06 16:32:39 crc kubenswrapper[4763]: I1006 16:32:39.556551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerStarted","Data":"7bb1cb7a0e1459c528a2417315f2e26d3223b22ae5c9392b6bdeb6992701bb18"} Oct 06 16:32:39 crc kubenswrapper[4763]: I1006 16:32:39.558101 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerStarted","Data":"bd91d897ceb978dcb3fb49694c8ae053b51344b063ca4d4ce44aa989b675737b"} Oct 06 
16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.049268 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lsp4w"] Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.060453 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lsp4w"] Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.588790 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753794e2-1eed-49c0-81d0-b684b31e1986" path="/var/lib/kubelet/pods/753794e2-1eed-49c0-81d0-b684b31e1986/volumes" Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.590581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerDied","Data":"406287c65797cd2ff0595db4b85e18a58994679e45b67a790eaa03a26cc43710"} Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.590566 4763 generic.go:334] "Generic (PLEG): container finished" podID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerID="406287c65797cd2ff0595db4b85e18a58994679e45b67a790eaa03a26cc43710" exitCode=0 Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.594828 4763 generic.go:334] "Generic (PLEG): container finished" podID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerID="170a8f5ca5e2bdeb9ac21f65a289f63bf5745d0c14eb40adebf600d5d8afcca3" exitCode=0 Oct 06 16:32:41 crc kubenswrapper[4763]: I1006 16:32:41.594870 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerDied","Data":"170a8f5ca5e2bdeb9ac21f65a289f63bf5745d0c14eb40adebf600d5d8afcca3"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.352217 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd8q7\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.366953 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.367024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data\") pod \"e75a2745-5aa0-47fe-8225-fadc3de8d130\" (UID: \"e75a2745-5aa0-47fe-8225-fadc3de8d130\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.368481 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs" (OuterVolumeSpecName: "logs") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.373210 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts" (OuterVolumeSpecName: "scripts") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.372436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph" (OuterVolumeSpecName: "ceph") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.375887 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.380521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7" (OuterVolumeSpecName: "kube-api-access-cd8q7") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "kube-api-access-cd8q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.451057 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474368 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474423 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474491 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474601 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56s2b\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs\") pod 
\"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.474801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data\") pod \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\" (UID: \"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4\") " Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.475341 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd8q7\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-kube-api-access-cd8q7\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.475367 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.475380 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75a2745-5aa0-47fe-8225-fadc3de8d130-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.475390 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e75a2745-5aa0-47fe-8225-fadc3de8d130-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.475402 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.477390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.479368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs" (OuterVolumeSpecName: "logs") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.487118 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph" (OuterVolumeSpecName: "ceph") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.487936 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts" (OuterVolumeSpecName: "scripts") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.489952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b" (OuterVolumeSpecName: "kube-api-access-56s2b") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "kube-api-access-56s2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.505092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577087 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56s2b\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-kube-api-access-56s2b\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577374 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577518 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577542 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577558 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.577570 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-ceph\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.585244 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data" (OuterVolumeSpecName: "config-data") pod "e75a2745-5aa0-47fe-8225-fadc3de8d130" (UID: "e75a2745-5aa0-47fe-8225-fadc3de8d130"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.599116 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.617184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data" (OuterVolumeSpecName: "config-data") pod "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" (UID: "3b1d4c0e-3603-4958-a2ee-eb06eff5bee4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.634863 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.634876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b1d4c0e-3603-4958-a2ee-eb06eff5bee4","Type":"ContainerDied","Data":"bcf60ac450010b7501caed4d526a1db7cbb21645f55256cc4522594583a8722e"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.634950 4763 scope.go:117] "RemoveContainer" containerID="406287c65797cd2ff0595db4b85e18a58994679e45b67a790eaa03a26cc43710" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.641585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerStarted","Data":"74e03879e51ab578aca993d5847a42766184918f2034ad54a95cb19979ff376f"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.641674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerStarted","Data":"ed46cba80662fa2d77c188ff513ec706b6e2638afeb42bb7c460f75b7ee64392"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.648112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerStarted","Data":"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.648148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerStarted","Data":"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.652351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerStarted","Data":"0afaff7d15c4ad05b200e20e307011a80509ee281cd111d89fadedabef6bb195"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.652377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerStarted","Data":"72c83d9ba8ccd313e4d907a47923246a78c6f864ce8f6a0f036cf88f1ea12ac2"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.652465 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fbd8b6749-rs2fw" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon-log" containerID="cri-o://72c83d9ba8ccd313e4d907a47923246a78c6f864ce8f6a0f036cf88f1ea12ac2" gracePeriod=30 Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.652732 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-6fbd8b6749-rs2fw" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon" containerID="cri-o://0afaff7d15c4ad05b200e20e307011a80509ee281cd111d89fadedabef6bb195" gracePeriod=30 Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.661914 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.661829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e75a2745-5aa0-47fe-8225-fadc3de8d130","Type":"ContainerDied","Data":"f0cd1ba86a26e7b2219f111af653c3243d3bc454ec032020821d0a34dcc556bb"} Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.672431 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79d47c4685-dmtl6" podStartSLOduration=1.942995566 podStartE2EDuration="8.672411167s" podCreationTimestamp="2025-10-06 16:32:37 +0000 UTC" firstStartedPulling="2025-10-06 16:32:38.23634303 +0000 UTC m=+5955.391635542" lastFinishedPulling="2025-10-06 16:32:44.965758631 +0000 UTC m=+5962.121051143" observedRunningTime="2025-10-06 16:32:45.670950497 +0000 UTC m=+5962.826243059" watchObservedRunningTime="2025-10-06 16:32:45.672411167 +0000 UTC m=+5962.827703679" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.688640 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.688698 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75a2745-5aa0-47fe-8225-fadc3de8d130-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.688716 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.691008 4763 scope.go:117] "RemoveContainer" containerID="bd0a0b7bdfe3d5fe79478b17b35b7be0828ceec6f7ae720249005a4222a01523" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.724846 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fbd8b6749-rs2fw" podStartSLOduration=2.328665643 podStartE2EDuration="8.72481491s" podCreationTimestamp="2025-10-06 16:32:37 +0000 UTC" firstStartedPulling="2025-10-06 16:32:38.571014462 +0000 UTC m=+5955.726306974" lastFinishedPulling="2025-10-06 16:32:44.967163729 +0000 UTC m=+5962.122456241" observedRunningTime="2025-10-06 16:32:45.694541958 +0000 UTC m=+5962.849834490" watchObservedRunningTime="2025-10-06 16:32:45.72481491 +0000 UTC m=+5962.880107422" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.726570 4763 scope.go:117] "RemoveContainer" containerID="170a8f5ca5e2bdeb9ac21f65a289f63bf5745d0c14eb40adebf600d5d8afcca3" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.774829 4763 scope.go:117] "RemoveContainer" containerID="fb17470f297a8aaaa2af829c7843e1eff33b230a533c5629817531f801d5eb13" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.779010 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.791101 4763 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.803135 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: E1006 16:32:45.803702 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.803728 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: E1006 16:32:45.803775 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.803786 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: E1006 16:32:45.803817 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.803826 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: E1006 16:32:45.803848 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.803855 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.804212 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.804948 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.804980 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-httpd" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.805031 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" containerName="glance-log" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.805956 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c9cf99f6f-v9tb6" podStartSLOduration=1.884025421 podStartE2EDuration="7.805936294s" podCreationTimestamp="2025-10-06 16:32:38 +0000 UTC" firstStartedPulling="2025-10-06 16:32:38.988546945 +0000 UTC m=+5956.143839457" lastFinishedPulling="2025-10-06 16:32:44.910457808 +0000 UTC m=+5962.065750330" observedRunningTime="2025-10-06 16:32:45.733973469 +0000 UTC m=+5962.889265981" watchObservedRunningTime="2025-10-06 16:32:45.805936294 +0000 UTC m=+5962.961228806" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.806544 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.808353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-th44z" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.809032 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.809243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.822376 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.836060 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.851598 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.864206 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.868963 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.871369 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.874785 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.897433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.897506 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.897564 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2l7\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-kube-api-access-tn2l7\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.897606 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.898831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.898877 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.898919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899007 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-config-data\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-logs\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-scripts\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899466 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-ceph\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:45 crc kubenswrapper[4763]: I1006 16:32:45.899503 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z62m\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-kube-api-access-8z62m\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.000856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.000950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2l7\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-kube-api-access-tn2l7\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.000995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001104 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-config-data\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-logs\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-scripts\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-ceph\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z62m\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-kube-api-access-8z62m\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.001774 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-logs\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.002056 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.002667 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/944eb40b-1e91-47ef-8568-b156709d0b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.003247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399f4bf6-d9f3-4550-af17-4c87ebc31e30-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.005357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.005578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-scripts\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.005962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-ceph\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.005980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-config-data\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.007034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f4bf6-d9f3-4550-af17-4c87ebc31e30-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.007475 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.012270 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-ceph\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.014293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944eb40b-1e91-47ef-8568-b156709d0b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.023508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z62m\" (UniqueName: \"kubernetes.io/projected/944eb40b-1e91-47ef-8568-b156709d0b97-kube-api-access-8z62m\") pod \"glance-default-internal-api-0\" (UID: \"944eb40b-1e91-47ef-8568-b156709d0b97\") " pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.024478 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2l7\" (UniqueName: \"kubernetes.io/projected/399f4bf6-d9f3-4550-af17-4c87ebc31e30-kube-api-access-tn2l7\") pod \"glance-default-external-api-0\" (UID: \"399f4bf6-d9f3-4550-af17-4c87ebc31e30\") " pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.195033 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.214374 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:46 crc kubenswrapper[4763]: I1006 16:32:46.811283 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.585997 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1d4c0e-3603-4958-a2ee-eb06eff5bee4" path="/var/lib/kubelet/pods/3b1d4c0e-3603-4958-a2ee-eb06eff5bee4/volumes" Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.587751 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75a2745-5aa0-47fe-8225-fadc3de8d130" path="/var/lib/kubelet/pods/e75a2745-5aa0-47fe-8225-fadc3de8d130/volumes" Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.684708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"944eb40b-1e91-47ef-8568-b156709d0b97","Type":"ContainerStarted","Data":"0bdbac7d9bf1818aac8d2b9b5bf2c204c1cd232584128719119f4e34bb14b1ec"} Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.684747 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"944eb40b-1e91-47ef-8568-b156709d0b97","Type":"ContainerStarted","Data":"d7dbf4d3c202de21fbcd16849f24402159868005b57fe96f648bfa741a17876c"} Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.775283 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.793257 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.793384 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:32:47 crc kubenswrapper[4763]: I1006 16:32:47.940367 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:32:48 crc kubenswrapper[4763]: I1006 16:32:48.521481 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:48 crc kubenswrapper[4763]: I1006 16:32:48.521542 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:32:48 crc kubenswrapper[4763]: I1006 16:32:48.701304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"944eb40b-1e91-47ef-8568-b156709d0b97","Type":"ContainerStarted","Data":"dbae4299f72ee763e2e7c86165eaffec861be99f494e0558a10a913ff798cae2"} Oct 06 16:32:48 crc kubenswrapper[4763]: I1006 16:32:48.707348 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"399f4bf6-d9f3-4550-af17-4c87ebc31e30","Type":"ContainerStarted","Data":"0fe19ee1626b84b34859aba4373dfce4c3fdedb53df5af5bc049dcaec987a087"} Oct 06 16:32:48 crc kubenswrapper[4763]: I1006 16:32:48.726555 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.726524594 podStartE2EDuration="3.726524594s" podCreationTimestamp="2025-10-06 16:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:32:48.724597931 +0000 UTC m=+5965.879890483" watchObservedRunningTime="2025-10-06 16:32:48.726524594 +0000 UTC m=+5965.881817146" Oct 06 16:32:49 crc kubenswrapper[4763]: I1006 16:32:49.716710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"399f4bf6-d9f3-4550-af17-4c87ebc31e30","Type":"ContainerStarted","Data":"0849bc4a4494999b7ce18dc0c7eabf52bcbe6d6cf2d35a30c4163ecc59d14335"} Oct 06 16:32:49 crc kubenswrapper[4763]: I1006 16:32:49.717301 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"399f4bf6-d9f3-4550-af17-4c87ebc31e30","Type":"ContainerStarted","Data":"a779b0a8d163a7ac2446807e50128bd17d91d5970ee9786f9b8b1fef6a21ef30"} Oct 06 16:32:49 crc kubenswrapper[4763]: I1006 16:32:49.736542 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.736524671 podStartE2EDuration="4.736524671s" podCreationTimestamp="2025-10-06 16:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:32:49.732137452 +0000 UTC m=+5966.887429994" watchObservedRunningTime="2025-10-06 16:32:49.736524671 +0000 UTC m=+5966.891817183" Oct 06 16:32:50 crc kubenswrapper[4763]: I1006 16:32:50.041522 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5730-account-create-f2k7w"] Oct 06 16:32:50 crc kubenswrapper[4763]: I1006 16:32:50.050082 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5730-account-create-f2k7w"] Oct 06 16:32:51 crc kubenswrapper[4763]: I1006 16:32:51.592226 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358ef27a-74d6-44d0-baf7-d8576b31fb47" path="/var/lib/kubelet/pods/358ef27a-74d6-44d0-baf7-d8576b31fb47/volumes" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.195251 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.195760 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.215685 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.215753 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.246220 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.246657 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-external-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.251919 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.264028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.788808 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.789086 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.789096 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 16:32:56 crc kubenswrapper[4763]: I1006 16:32:56.789217 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 16:32:57 crc kubenswrapper[4763]: I1006 16:32:57.795237 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.522133 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.723685 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.802042 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.802077 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.802060 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 16:32:58 crc kubenswrapper[4763]: I1006 16:32:58.831678 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 16:32:59 crc kubenswrapper[4763]: I1006 16:32:59.017365 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 16:32:59 crc kubenswrapper[4763]: I1006 16:32:59.039993 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qg9qj"] Oct 06 16:32:59 crc kubenswrapper[4763]: I1006 16:32:59.054111 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qg9qj"] Oct 06 16:32:59 crc kubenswrapper[4763]: I1006 16:32:59.413202 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 16:32:59 crc kubenswrapper[4763]: I1006 16:32:59.587853 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b29e5a26-c005-4cdf-9483-131828c01169" path="/var/lib/kubelet/pods/b29e5a26-c005-4cdf-9483-131828c01169/volumes" Oct 06 16:33:03 crc kubenswrapper[4763]: I1006 16:33:03.876522 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:33:03 crc kubenswrapper[4763]: I1006 16:33:03.877634 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:33:09 crc kubenswrapper[4763]: I1006 16:33:09.844088 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:33:10 crc kubenswrapper[4763]: I1006 16:33:10.254599 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:33:11 crc kubenswrapper[4763]: I1006 16:33:11.444638 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:33:12 crc kubenswrapper[4763]: I1006 16:33:12.078928 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:33:12 crc kubenswrapper[4763]: I1006 16:33:12.139128 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:33:12 crc kubenswrapper[4763]: I1006 16:33:12.139336 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon-log" containerID="cri-o://ed46cba80662fa2d77c188ff513ec706b6e2638afeb42bb7c460f75b7ee64392" gracePeriod=30 Oct 06 16:33:12 crc kubenswrapper[4763]: I1006 16:33:12.139486 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" containerID="cri-o://74e03879e51ab578aca993d5847a42766184918f2034ad54a95cb19979ff376f" gracePeriod=30 Oct 06 16:33:15 crc kubenswrapper[4763]: I1006 16:33:15.995264 4763 generic.go:334] "Generic (PLEG): container finished" podID="39cbe76c-2dda-46a6-b489-6381d73930af" containerID="74e03879e51ab578aca993d5847a42766184918f2034ad54a95cb19979ff376f" exitCode=0 Oct 06 16:33:15 crc kubenswrapper[4763]: I1006 16:33:15.995412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerDied","Data":"74e03879e51ab578aca993d5847a42766184918f2034ad54a95cb19979ff376f"} Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:15.999102 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerID="0afaff7d15c4ad05b200e20e307011a80509ee281cd111d89fadedabef6bb195" exitCode=137 Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:15.999120 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerID="72c83d9ba8ccd313e4d907a47923246a78c6f864ce8f6a0f036cf88f1ea12ac2" exitCode=137 Oct 06 16:33:16 crc 
kubenswrapper[4763]: I1006 16:33:15.999137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerDied","Data":"0afaff7d15c4ad05b200e20e307011a80509ee281cd111d89fadedabef6bb195"} Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:15.999198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerDied","Data":"72c83d9ba8ccd313e4d907a47923246a78c6f864ce8f6a0f036cf88f1ea12ac2"} Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.206084 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.289005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slmwl\" (UniqueName: \"kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl\") pod \"6ff90281-f3f7-40a1-87ce-50af05900b74\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.289083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key\") pod \"6ff90281-f3f7-40a1-87ce-50af05900b74\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.289250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts\") pod \"6ff90281-f3f7-40a1-87ce-50af05900b74\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.289378 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs\") pod \"6ff90281-f3f7-40a1-87ce-50af05900b74\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.289422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data\") pod \"6ff90281-f3f7-40a1-87ce-50af05900b74\" (UID: \"6ff90281-f3f7-40a1-87ce-50af05900b74\") " Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.291444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs" (OuterVolumeSpecName: "logs") pod "6ff90281-f3f7-40a1-87ce-50af05900b74" (UID: "6ff90281-f3f7-40a1-87ce-50af05900b74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.296089 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6ff90281-f3f7-40a1-87ce-50af05900b74" (UID: "6ff90281-f3f7-40a1-87ce-50af05900b74"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.296167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl" (OuterVolumeSpecName: "kube-api-access-slmwl") pod "6ff90281-f3f7-40a1-87ce-50af05900b74" (UID: "6ff90281-f3f7-40a1-87ce-50af05900b74"). InnerVolumeSpecName "kube-api-access-slmwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.325531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts" (OuterVolumeSpecName: "scripts") pod "6ff90281-f3f7-40a1-87ce-50af05900b74" (UID: "6ff90281-f3f7-40a1-87ce-50af05900b74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.326291 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data" (OuterVolumeSpecName: "config-data") pod "6ff90281-f3f7-40a1-87ce-50af05900b74" (UID: "6ff90281-f3f7-40a1-87ce-50af05900b74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.391540 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.391793 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slmwl\" (UniqueName: \"kubernetes.io/projected/6ff90281-f3f7-40a1-87ce-50af05900b74-kube-api-access-slmwl\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.391869 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff90281-f3f7-40a1-87ce-50af05900b74-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.391934 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff90281-f3f7-40a1-87ce-50af05900b74-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:16 crc kubenswrapper[4763]: I1006 16:33:16.391997 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff90281-f3f7-40a1-87ce-50af05900b74-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.011879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd8b6749-rs2fw" event={"ID":"6ff90281-f3f7-40a1-87ce-50af05900b74","Type":"ContainerDied","Data":"bd91d897ceb978dcb3fb49694c8ae053b51344b063ca4d4ce44aa989b675737b"} Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.012316 4763 scope.go:117] "RemoveContainer" containerID="0afaff7d15c4ad05b200e20e307011a80509ee281cd111d89fadedabef6bb195" Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.011924 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fbd8b6749-rs2fw" Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.097992 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.098079 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fbd8b6749-rs2fw"] Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.207086 4763 scope.go:117] "RemoveContainer" containerID="72c83d9ba8ccd313e4d907a47923246a78c6f864ce8f6a0f036cf88f1ea12ac2" Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.585512 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" path="/var/lib/kubelet/pods/6ff90281-f3f7-40a1-87ce-50af05900b74/volumes" Oct 06 16:33:17 crc kubenswrapper[4763]: I1006 16:33:17.794435 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 06 16:33:27 crc kubenswrapper[4763]: I1006 16:33:27.794398 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.066008 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5jv9k"] Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.098848 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5jv9k"] Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.392399 4763 scope.go:117] "RemoveContainer" containerID="76f4c1ca7885bb77c9251f86fef173e79da477b24560f096aeb2a74b07aa4025" Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.423337 4763 scope.go:117] "RemoveContainer" containerID="b4a652916d05bb44086cf712da3e1beeb22b83c95504b2a19efe93c4b3b29d03" Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.502265 4763 scope.go:117] "RemoveContainer" containerID="2ec638e498b7adc69ad84f3c79770b08ed5166ab27dd7371f6cd6270c43aff50" Oct 06 16:33:31 crc kubenswrapper[4763]: I1006 16:33:31.595308 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8429c81-df56-4dd6-a297-85f6ec56a2ed" path="/var/lib/kubelet/pods/f8429c81-df56-4dd6-a297-85f6ec56a2ed/volumes" Oct 06 16:33:33 crc kubenswrapper[4763]: I1006 16:33:33.876398 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:33:33 crc kubenswrapper[4763]: I1006 16:33:33.876715 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:33:33 crc kubenswrapper[4763]: I1006 16:33:33.876789 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:33:33 crc kubenswrapper[4763]: I1006 16:33:33.877536 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:33:33 crc kubenswrapper[4763]: I1006 16:33:33.877603 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" gracePeriod=600 Oct 06 16:33:34 crc kubenswrapper[4763]: E1006 16:33:34.026668 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:33:34 crc kubenswrapper[4763]: I1006 16:33:34.216910 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" exitCode=0 Oct 06 16:33:34 crc kubenswrapper[4763]: I1006 16:33:34.216996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428"} Oct 06 16:33:34 crc kubenswrapper[4763]: I1006 16:33:34.217608 4763 scope.go:117] "RemoveContainer" containerID="18239693b1bebb33d80ff60a66ab6865889de136729fdb80a6e4fbf4547b7272" Oct 06 16:33:34 crc kubenswrapper[4763]: I1006 16:33:34.218310 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:33:34 crc kubenswrapper[4763]: E1006 16:33:34.218781 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:33:37 crc kubenswrapper[4763]: I1006 16:33:37.794863 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79d47c4685-dmtl6" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 06 16:33:37 crc kubenswrapper[4763]: I1006 16:33:37.795557 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:33:41 crc kubenswrapper[4763]: I1006 16:33:41.036932 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-8826-account-create-kq9kx"] Oct 06 16:33:41 crc kubenswrapper[4763]: I1006 16:33:41.047762 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8826-account-create-kq9kx"] Oct 06 16:33:41 crc kubenswrapper[4763]: I1006 16:33:41.595064 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cd77c1-a82d-4262-a70d-9bb52cac0667" path="/var/lib/kubelet/pods/98cd77c1-a82d-4262-a70d-9bb52cac0667/volumes" Oct 06 16:33:42 crc kubenswrapper[4763]: I1006 16:33:42.321316 4763 generic.go:334] "Generic (PLEG): container finished" podID="39cbe76c-2dda-46a6-b489-6381d73930af" containerID="ed46cba80662fa2d77c188ff513ec706b6e2638afeb42bb7c460f75b7ee64392" exitCode=137 Oct 06 16:33:42 crc kubenswrapper[4763]: I1006 16:33:42.321484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerDied","Data":"ed46cba80662fa2d77c188ff513ec706b6e2638afeb42bb7c460f75b7ee64392"} Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.637509 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732417 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key\") pod \"39cbe76c-2dda-46a6-b489-6381d73930af\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732508 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v9tx\" (UniqueName: \"kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx\") pod \"39cbe76c-2dda-46a6-b489-6381d73930af\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs\") pod \"39cbe76c-2dda-46a6-b489-6381d73930af\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts\") pod \"39cbe76c-2dda-46a6-b489-6381d73930af\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732806 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data\") pod \"39cbe76c-2dda-46a6-b489-6381d73930af\" (UID: \"39cbe76c-2dda-46a6-b489-6381d73930af\") " Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.732946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs" (OuterVolumeSpecName: "logs") pod "39cbe76c-2dda-46a6-b489-6381d73930af" (UID: "39cbe76c-2dda-46a6-b489-6381d73930af"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.733330 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39cbe76c-2dda-46a6-b489-6381d73930af-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.751637 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "39cbe76c-2dda-46a6-b489-6381d73930af" (UID: "39cbe76c-2dda-46a6-b489-6381d73930af"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.753564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx" (OuterVolumeSpecName: "kube-api-access-2v9tx") pod "39cbe76c-2dda-46a6-b489-6381d73930af" (UID: "39cbe76c-2dda-46a6-b489-6381d73930af"). InnerVolumeSpecName "kube-api-access-2v9tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.762857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts" (OuterVolumeSpecName: "scripts") pod "39cbe76c-2dda-46a6-b489-6381d73930af" (UID: "39cbe76c-2dda-46a6-b489-6381d73930af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.763176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data" (OuterVolumeSpecName: "config-data") pod "39cbe76c-2dda-46a6-b489-6381d73930af" (UID: "39cbe76c-2dda-46a6-b489-6381d73930af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.835347 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/39cbe76c-2dda-46a6-b489-6381d73930af-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.835378 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v9tx\" (UniqueName: \"kubernetes.io/projected/39cbe76c-2dda-46a6-b489-6381d73930af-kube-api-access-2v9tx\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.835389 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:42.835397 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cbe76c-2dda-46a6-b489-6381d73930af-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.332227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d47c4685-dmtl6" event={"ID":"39cbe76c-2dda-46a6-b489-6381d73930af","Type":"ContainerDied","Data":"9b3d6f4364d6c474b7a41bc2504ebbac3fd73d58830d63f960de0e7f71d9c4cc"} Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.332278 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79d47c4685-dmtl6" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.332287 4763 scope.go:117] "RemoveContainer" containerID="74e03879e51ab578aca993d5847a42766184918f2034ad54a95cb19979ff376f" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.373165 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.385758 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79d47c4685-dmtl6"] Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.514643 4763 scope.go:117] "RemoveContainer" containerID="ed46cba80662fa2d77c188ff513ec706b6e2638afeb42bb7c460f75b7ee64392" Oct 06 16:33:43 crc kubenswrapper[4763]: I1006 16:33:43.591130 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" path="/var/lib/kubelet/pods/39cbe76c-2dda-46a6-b489-6381d73930af/volumes" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.064487 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v2jhg"] Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.074453 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v2jhg"] Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.252773 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d5d79488f-hhjsh"] Oct 06 16:33:46 crc kubenswrapper[4763]: E1006 16:33:46.253250 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253273 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: E1006 16:33:46.253288 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253296 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: E1006 16:33:46.253315 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253324 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: E1006 16:33:46.253345 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253351 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253563 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253583 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon-log" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253598 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff90281-f3f7-40a1-87ce-50af05900b74" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.253608 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cbe76c-2dda-46a6-b489-6381d73930af" containerName="horizon" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.254631 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.280115 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5d79488f-hhjsh"] Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.313143 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ae65b4f-ac98-4355-9660-b0eebb531175-horizon-secret-key\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.313211 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-scripts\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.313252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6t75\" (UniqueName: \"kubernetes.io/projected/3ae65b4f-ac98-4355-9660-b0eebb531175-kube-api-access-s6t75\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.313289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae65b4f-ac98-4355-9660-b0eebb531175-logs\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.313403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-config-data\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.414765 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ae65b4f-ac98-4355-9660-b0eebb531175-horizon-secret-key\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.414820 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-scripts\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.414850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6t75\" (UniqueName: \"kubernetes.io/projected/3ae65b4f-ac98-4355-9660-b0eebb531175-kube-api-access-s6t75\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.414873 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3ae65b4f-ac98-4355-9660-b0eebb531175-logs\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.415009 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-config-data\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.415381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae65b4f-ac98-4355-9660-b0eebb531175-logs\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.415851 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-scripts\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.416386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ae65b4f-ac98-4355-9660-b0eebb531175-config-data\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.420821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ae65b4f-ac98-4355-9660-b0eebb531175-horizon-secret-key\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.431060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6t75\" (UniqueName: \"kubernetes.io/projected/3ae65b4f-ac98-4355-9660-b0eebb531175-kube-api-access-s6t75\") pod \"horizon-5d5d79488f-hhjsh\" (UID: \"3ae65b4f-ac98-4355-9660-b0eebb531175\") " pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:46 crc kubenswrapper[4763]: I1006 16:33:46.583391 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.082245 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5d79488f-hhjsh"] Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.370154 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d79488f-hhjsh" event={"ID":"3ae65b4f-ac98-4355-9660-b0eebb531175","Type":"ContainerStarted","Data":"5666cd82dca8e75d0b52493a6e3470c5c5708bd825b82bb55364b939108de58d"} Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.577480 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:33:47 crc kubenswrapper[4763]: E1006 16:33:47.577828 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.594032 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88bc1ed1-051f-4afb-9ae1-aba8065452e2" path="/var/lib/kubelet/pods/88bc1ed1-051f-4afb-9ae1-aba8065452e2/volumes" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.716192 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-mm5f2"] Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.717421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.727149 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mm5f2"] Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.845531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4tmd\" (UniqueName: \"kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd\") pod \"heat-db-create-mm5f2\" (UID: \"ea093121-dc49-4bf5-aae2-0539f4f51467\") " pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.948123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4tmd\" (UniqueName: \"kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd\") pod \"heat-db-create-mm5f2\" (UID: \"ea093121-dc49-4bf5-aae2-0539f4f51467\") " pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:47 crc kubenswrapper[4763]: I1006 16:33:47.977673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4tmd\" (UniqueName: \"kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd\") pod \"heat-db-create-mm5f2\" (UID: \"ea093121-dc49-4bf5-aae2-0539f4f51467\") " pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:48 crc kubenswrapper[4763]: I1006 16:33:48.055598 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:48 crc kubenswrapper[4763]: I1006 16:33:48.382512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d79488f-hhjsh" event={"ID":"3ae65b4f-ac98-4355-9660-b0eebb531175","Type":"ContainerStarted","Data":"7c743d7f481bdfc48bfb4b48d53598211f2c94ebfe5f7c68663901f10974e463"} Oct 06 16:33:48 crc kubenswrapper[4763]: I1006 16:33:48.382909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d79488f-hhjsh" event={"ID":"3ae65b4f-ac98-4355-9660-b0eebb531175","Type":"ContainerStarted","Data":"a3fb02cabe4157dca26c00a9b05a0a37bc7c60605442c417d798e41e4f91ca1a"} Oct 06 16:33:48 crc kubenswrapper[4763]: I1006 16:33:48.412303 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d5d79488f-hhjsh" podStartSLOduration=2.412278972 podStartE2EDuration="2.412278972s" podCreationTimestamp="2025-10-06 16:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:33:48.398099197 +0000 UTC m=+6025.553391709" watchObservedRunningTime="2025-10-06 16:33:48.412278972 +0000 UTC m=+6025.567571494" Oct 06 16:33:48 crc kubenswrapper[4763]: I1006 16:33:48.533777 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mm5f2"] Oct 06 16:33:49 crc kubenswrapper[4763]: I1006 16:33:49.402518 4763 generic.go:334] "Generic (PLEG): container finished" podID="ea093121-dc49-4bf5-aae2-0539f4f51467" containerID="d7e1e25ea32ccf8006fb4080e271239019e26451a6add1cccb17fd368b09182e" exitCode=0 Oct 06 16:33:49 crc kubenswrapper[4763]: I1006 16:33:49.402864 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mm5f2" event={"ID":"ea093121-dc49-4bf5-aae2-0539f4f51467","Type":"ContainerDied","Data":"d7e1e25ea32ccf8006fb4080e271239019e26451a6add1cccb17fd368b09182e"} Oct 06 16:33:49 crc kubenswrapper[4763]: I1006 16:33:49.403309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mm5f2" event={"ID":"ea093121-dc49-4bf5-aae2-0539f4f51467","Type":"ContainerStarted","Data":"64262f0aaae55c48908ed66e87815b278203dab9716835d3136e98707fb68cfd"} Oct 06 16:33:50 crc kubenswrapper[4763]: I1006 16:33:50.793652 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:50 crc kubenswrapper[4763]: I1006 16:33:50.907997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4tmd\" (UniqueName: \"kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd\") pod \"ea093121-dc49-4bf5-aae2-0539f4f51467\" (UID: \"ea093121-dc49-4bf5-aae2-0539f4f51467\") " Oct 06 16:33:50 crc kubenswrapper[4763]: I1006 16:33:50.920604 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd" (OuterVolumeSpecName: "kube-api-access-r4tmd") pod "ea093121-dc49-4bf5-aae2-0539f4f51467" (UID: "ea093121-dc49-4bf5-aae2-0539f4f51467"). InnerVolumeSpecName "kube-api-access-r4tmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:33:51 crc kubenswrapper[4763]: I1006 16:33:51.010028 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4tmd\" (UniqueName: \"kubernetes.io/projected/ea093121-dc49-4bf5-aae2-0539f4f51467-kube-api-access-r4tmd\") on node \"crc\" DevicePath \"\"" Oct 06 16:33:51 crc kubenswrapper[4763]: I1006 16:33:51.421921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mm5f2" event={"ID":"ea093121-dc49-4bf5-aae2-0539f4f51467","Type":"ContainerDied","Data":"64262f0aaae55c48908ed66e87815b278203dab9716835d3136e98707fb68cfd"} Oct 06 16:33:51 crc kubenswrapper[4763]: I1006 16:33:51.422173 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64262f0aaae55c48908ed66e87815b278203dab9716835d3136e98707fb68cfd" Oct 06 16:33:51 crc kubenswrapper[4763]: I1006 16:33:51.422443 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mm5f2" Oct 06 16:33:56 crc kubenswrapper[4763]: I1006 16:33:56.584473 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:56 crc kubenswrapper[4763]: I1006 16:33:56.585393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.708036 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-3e6e-account-create-86nsx"] Oct 06 16:33:57 crc kubenswrapper[4763]: E1006 16:33:57.708884 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea093121-dc49-4bf5-aae2-0539f4f51467" containerName="mariadb-database-create" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.708903 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea093121-dc49-4bf5-aae2-0539f4f51467" containerName="mariadb-database-create" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.709201 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea093121-dc49-4bf5-aae2-0539f4f51467" containerName="mariadb-database-create" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.722436 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.732590 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.773832 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3e6e-account-create-86nsx"] Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.875809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnflk\" (UniqueName: \"kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk\") pod \"heat-3e6e-account-create-86nsx\" (UID: \"5444e6ef-05c2-4c2d-b742-03edf03cb800\") " pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:33:57 crc kubenswrapper[4763]: I1006 16:33:57.977483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnflk\" (UniqueName: \"kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk\") pod \"heat-3e6e-account-create-86nsx\" (UID: \"5444e6ef-05c2-4c2d-b742-03edf03cb800\") " pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:33:58 crc kubenswrapper[4763]: I1006 16:33:58.012424 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnflk\" (UniqueName: \"kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk\") pod \"heat-3e6e-account-create-86nsx\" (UID: \"5444e6ef-05c2-4c2d-b742-03edf03cb800\") " pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:33:58 crc kubenswrapper[4763]: I1006 16:33:58.054236 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:33:58 crc kubenswrapper[4763]: W1006 16:33:58.561108 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5444e6ef_05c2_4c2d_b742_03edf03cb800.slice/crio-f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8 WatchSource:0}: Error finding container f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8: Status 404 returned error can't find the container with id f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8 Oct 06 16:33:58 crc kubenswrapper[4763]: I1006 16:33:58.567885 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3e6e-account-create-86nsx"] Oct 06 16:33:59 crc kubenswrapper[4763]: I1006 16:33:59.504818 4763 generic.go:334] "Generic (PLEG): container finished" podID="5444e6ef-05c2-4c2d-b742-03edf03cb800" containerID="22246f55807766136d966c83c08980810488df706ceae01aa2921e16f8edc1d5" exitCode=0 Oct 06 16:33:59 crc kubenswrapper[4763]: I1006 16:33:59.504938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3e6e-account-create-86nsx" event={"ID":"5444e6ef-05c2-4c2d-b742-03edf03cb800","Type":"ContainerDied","Data":"22246f55807766136d966c83c08980810488df706ceae01aa2921e16f8edc1d5"} Oct 06 16:33:59 crc kubenswrapper[4763]: I1006 16:33:59.505086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3e6e-account-create-86nsx" event={"ID":"5444e6ef-05c2-4c2d-b742-03edf03cb800","Type":"ContainerStarted","Data":"f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8"} Oct 06 16:33:59 crc kubenswrapper[4763]: I1006 16:33:59.576027 4763 scope.go:117] "RemoveContainer" 
containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:33:59 crc kubenswrapper[4763]: E1006 16:33:59.576296 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:34:00 crc kubenswrapper[4763]: I1006 16:34:00.888220 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.042169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnflk\" (UniqueName: \"kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk\") pod \"5444e6ef-05c2-4c2d-b742-03edf03cb800\" (UID: \"5444e6ef-05c2-4c2d-b742-03edf03cb800\") " Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.064962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk" (OuterVolumeSpecName: "kube-api-access-pnflk") pod "5444e6ef-05c2-4c2d-b742-03edf03cb800" (UID: "5444e6ef-05c2-4c2d-b742-03edf03cb800"). InnerVolumeSpecName "kube-api-access-pnflk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.144768 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnflk\" (UniqueName: \"kubernetes.io/projected/5444e6ef-05c2-4c2d-b742-03edf03cb800-kube-api-access-pnflk\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.526667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3e6e-account-create-86nsx" event={"ID":"5444e6ef-05c2-4c2d-b742-03edf03cb800","Type":"ContainerDied","Data":"f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8"} Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.527116 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f604b968b1f8091a3198f64272e7f4d57dd4b92fc9749382bb310a04322f66f8" Oct 06 16:34:01 crc kubenswrapper[4763]: I1006 16:34:01.527240 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3e6e-account-create-86nsx" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.849437 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-9bzwg"] Oct 06 16:34:02 crc kubenswrapper[4763]: E1006 16:34:02.850813 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5444e6ef-05c2-4c2d-b742-03edf03cb800" containerName="mariadb-account-create" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.850899 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444e6ef-05c2-4c2d-b742-03edf03cb800" containerName="mariadb-account-create" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.851191 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5444e6ef-05c2-4c2d-b742-03edf03cb800" containerName="mariadb-account-create" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.852081 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.854536 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-r2ttq" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.857307 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.871156 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9bzwg"] Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.984343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkmvx\" (UniqueName: \"kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.984401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:02 crc kubenswrapper[4763]: I1006 16:34:02.984598 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.086450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.086539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkmvx\" (UniqueName: \"kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.086568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.092814 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.100754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" 
Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.103491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkmvx\" (UniqueName: \"kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx\") pod \"heat-db-sync-9bzwg\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.177575 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:03 crc kubenswrapper[4763]: I1006 16:34:03.668135 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9bzwg"] Oct 06 16:34:04 crc kubenswrapper[4763]: I1006 16:34:04.558558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9bzwg" event={"ID":"88647c36-cdad-4200-a1f6-e3d96d1492d9","Type":"ContainerStarted","Data":"a601fd5dd21ccc75f0a17749b70f59786873ee9da1b83284a4129ae8d21ff245"} Oct 06 16:34:08 crc kubenswrapper[4763]: I1006 16:34:08.633476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:34:10 crc kubenswrapper[4763]: I1006 16:34:10.341563 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d5d79488f-hhjsh" Oct 06 16:34:10 crc kubenswrapper[4763]: I1006 16:34:10.472544 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:34:10 crc kubenswrapper[4763]: I1006 16:34:10.473361 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon-log" containerID="cri-o://839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5" gracePeriod=30 Oct 06 16:34:10 crc kubenswrapper[4763]: I1006 16:34:10.473826 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" containerID="cri-o://f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8" gracePeriod=30 Oct 06 16:34:11 crc kubenswrapper[4763]: I1006 16:34:11.653130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9bzwg" event={"ID":"88647c36-cdad-4200-a1f6-e3d96d1492d9","Type":"ContainerStarted","Data":"98de6adbe357c1533a41e7534659c6f8289365ae839544a75546d049181da255"} Oct 06 16:34:11 crc kubenswrapper[4763]: I1006 16:34:11.672435 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-9bzwg" podStartSLOduration=2.529659692 podStartE2EDuration="9.6724106s" podCreationTimestamp="2025-10-06 16:34:02 +0000 UTC" firstStartedPulling="2025-10-06 16:34:03.679932979 +0000 UTC m=+6040.835225501" lastFinishedPulling="2025-10-06 16:34:10.822683897 +0000 UTC m=+6047.977976409" observedRunningTime="2025-10-06 16:34:11.670707054 +0000 UTC m=+6048.825999626" watchObservedRunningTime="2025-10-06 16:34:11.6724106 +0000 UTC m=+6048.827703152" Oct 06 16:34:13 crc kubenswrapper[4763]: I1006 16:34:13.674736 4763 generic.go:334] "Generic (PLEG): container finished" podID="88647c36-cdad-4200-a1f6-e3d96d1492d9" containerID="98de6adbe357c1533a41e7534659c6f8289365ae839544a75546d049181da255" exitCode=0 Oct 06 16:34:13 crc kubenswrapper[4763]: I1006 16:34:13.674822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9bzwg" 
event={"ID":"88647c36-cdad-4200-a1f6-e3d96d1492d9","Type":"ContainerDied","Data":"98de6adbe357c1533a41e7534659c6f8289365ae839544a75546d049181da255"} Oct 06 16:34:14 crc kubenswrapper[4763]: I1006 16:34:14.575965 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:34:14 crc kubenswrapper[4763]: E1006 16:34:14.577217 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:34:14 crc kubenswrapper[4763]: I1006 16:34:14.688183 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerID="f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8" exitCode=0 Oct 06 16:34:14 crc kubenswrapper[4763]: I1006 16:34:14.688252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerDied","Data":"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8"} Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.035041 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.127506 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle\") pod \"88647c36-cdad-4200-a1f6-e3d96d1492d9\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.127583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkmvx\" (UniqueName: \"kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx\") pod \"88647c36-cdad-4200-a1f6-e3d96d1492d9\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.127692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data\") pod \"88647c36-cdad-4200-a1f6-e3d96d1492d9\" (UID: \"88647c36-cdad-4200-a1f6-e3d96d1492d9\") " Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.137393 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx" (OuterVolumeSpecName: "kube-api-access-jkmvx") pod "88647c36-cdad-4200-a1f6-e3d96d1492d9" (UID: "88647c36-cdad-4200-a1f6-e3d96d1492d9"). InnerVolumeSpecName "kube-api-access-jkmvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.164067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88647c36-cdad-4200-a1f6-e3d96d1492d9" (UID: "88647c36-cdad-4200-a1f6-e3d96d1492d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.221046 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data" (OuterVolumeSpecName: "config-data") pod "88647c36-cdad-4200-a1f6-e3d96d1492d9" (UID: "88647c36-cdad-4200-a1f6-e3d96d1492d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.230780 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.230878 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88647c36-cdad-4200-a1f6-e3d96d1492d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.230900 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkmvx\" (UniqueName: \"kubernetes.io/projected/88647c36-cdad-4200-a1f6-e3d96d1492d9-kube-api-access-jkmvx\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.698578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9bzwg" event={"ID":"88647c36-cdad-4200-a1f6-e3d96d1492d9","Type":"ContainerDied","Data":"a601fd5dd21ccc75f0a17749b70f59786873ee9da1b83284a4129ae8d21ff245"} Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.698634 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a601fd5dd21ccc75f0a17749b70f59786873ee9da1b83284a4129ae8d21ff245" Oct 06 16:34:15 crc kubenswrapper[4763]: I1006 16:34:15.698654 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9bzwg" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.093586 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7fb6f64cd9-654qc"] Oct 06 16:34:17 crc kubenswrapper[4763]: E1006 16:34:17.094243 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88647c36-cdad-4200-a1f6-e3d96d1492d9" containerName="heat-db-sync" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.094255 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="88647c36-cdad-4200-a1f6-e3d96d1492d9" containerName="heat-db-sync" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.094446 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="88647c36-cdad-4200-a1f6-e3d96d1492d9" containerName="heat-db-sync" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.095075 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.098104 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-r2ttq" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.098274 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.098389 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.138241 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7fb6f64cd9-654qc"] Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.173962 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-combined-ca-bundle\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.174045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.174175 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data-custom\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.174264 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq97\" (UniqueName: \"kubernetes.io/projected/44cf3c94-1942-46b0-a032-0fe8427aeb43-kube-api-access-bqq97\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.251748 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7d9998d569-7mv27"] Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.254164 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.257854 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.267200 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-65864c57c4-kcpk5"] Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.269409 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.271130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.281519 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7d9998d569-7mv27"] Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.284429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-combined-ca-bundle\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.284508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.284728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data-custom\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.284815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq97\" (UniqueName: \"kubernetes.io/projected/44cf3c94-1942-46b0-a032-0fe8427aeb43-kube-api-access-bqq97\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.293146 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-combined-ca-bundle\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.293701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.302097 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65864c57c4-kcpk5"] Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.310675 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cf3c94-1942-46b0-a032-0fe8427aeb43-config-data-custom\") pod \"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.333812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq97\" (UniqueName: \"kubernetes.io/projected/44cf3c94-1942-46b0-a032-0fe8427aeb43-kube-api-access-bqq97\") pod 
\"heat-engine-7fb6f64cd9-654qc\" (UID: \"44cf3c94-1942-46b0-a032-0fe8427aeb43\") " pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fl8\" (UniqueName: \"kubernetes.io/projected/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-kube-api-access-d7fl8\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsxrr\" (UniqueName: \"kubernetes.io/projected/de6851e6-2f99-4200-8577-6b43830fe709-kube-api-access-nsxrr\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data-custom\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-combined-ca-bundle\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-combined-ca-bundle\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.387827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data-custom\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.424541 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.490754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data-custom\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fl8\" (UniqueName: \"kubernetes.io/projected/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-kube-api-access-d7fl8\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsxrr\" (UniqueName: \"kubernetes.io/projected/de6851e6-2f99-4200-8577-6b43830fe709-kube-api-access-nsxrr\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data-custom\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491451 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491556 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-combined-ca-bundle\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.491632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-combined-ca-bundle\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.497583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-combined-ca-bundle\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " 
pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.497732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data-custom\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.499817 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-config-data\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.502516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.511731 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-combined-ca-bundle\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.514465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6851e6-2f99-4200-8577-6b43830fe709-config-data-custom\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.518359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsxrr\" (UniqueName: \"kubernetes.io/projected/de6851e6-2f99-4200-8577-6b43830fe709-kube-api-access-nsxrr\") pod \"heat-api-7d9998d569-7mv27\" (UID: \"de6851e6-2f99-4200-8577-6b43830fe709\") " pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.519656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fl8\" (UniqueName: \"kubernetes.io/projected/1ae45c3e-fd98-44d2-8900-b46cc7d1428e-kube-api-access-d7fl8\") pod \"heat-cfnapi-65864c57c4-kcpk5\" (UID: \"1ae45c3e-fd98-44d2-8900-b46cc7d1428e\") " pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.610256 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.700178 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:17 crc kubenswrapper[4763]: I1006 16:34:17.980826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7fb6f64cd9-654qc"] Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.126890 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7d9998d569-7mv27"] Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.393191 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65864c57c4-kcpk5"] Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.521960 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.732434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" event={"ID":"1ae45c3e-fd98-44d2-8900-b46cc7d1428e","Type":"ContainerStarted","Data":"22d839c8eb8648e945c084f667b70c641fedf9938c497d715453817b7a11cc61"} Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.734507 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fb6f64cd9-654qc" event={"ID":"44cf3c94-1942-46b0-a032-0fe8427aeb43","Type":"ContainerStarted","Data":"3d2588f9e66f9c41f1b9533ee50f1cfcd8177632af9bcf1f4371b0e2c65811e3"} Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.734575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fb6f64cd9-654qc" event={"ID":"44cf3c94-1942-46b0-a032-0fe8427aeb43","Type":"ContainerStarted","Data":"9f58dae6b061e61ebf720940aba702f4906ae558017e44619adda1093b7add5c"} Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.735721 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.736525 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d9998d569-7mv27" event={"ID":"de6851e6-2f99-4200-8577-6b43830fe709","Type":"ContainerStarted","Data":"a1c827bb71092b71271fc1a8fd8d516db697ac00981937d96c62d5ea81232e5f"} Oct 06 16:34:18 crc kubenswrapper[4763]: I1006 16:34:18.756516 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7fb6f64cd9-654qc" podStartSLOduration=1.756497774 podStartE2EDuration="1.756497774s" podCreationTimestamp="2025-10-06 16:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:34:18.747304754 +0000 UTC m=+6055.902597266" watchObservedRunningTime="2025-10-06 16:34:18.756497774 +0000 UTC m=+6055.911790286" Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.757054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d9998d569-7mv27" event={"ID":"de6851e6-2f99-4200-8577-6b43830fe709","Type":"ContainerStarted","Data":"a14cbd1eb00bef61f74c2953c4148b3940f6e606bebae3a39d979f37b41ef484"} Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.757633 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.758305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" event={"ID":"1ae45c3e-fd98-44d2-8900-b46cc7d1428e","Type":"ContainerStarted","Data":"ba2960aee697d3f015143cfc9987c46d281a71622319345151976e7c4bc6a587"} Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.758435 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.793777 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" podStartSLOduration=2.251276715 podStartE2EDuration="3.793760428s" podCreationTimestamp="2025-10-06 16:34:17 +0000 UTC" firstStartedPulling="2025-10-06 16:34:18.396545145 +0000 UTC m=+6055.551837657" lastFinishedPulling="2025-10-06 16:34:19.939028868 +0000 UTC m=+6057.094321370" observedRunningTime="2025-10-06 16:34:20.791602149 +0000 UTC m=+6057.946894661" watchObservedRunningTime="2025-10-06 16:34:20.793760428 +0000 UTC m=+6057.949052940" Oct 06 16:34:20 crc kubenswrapper[4763]: I1006 16:34:20.794562 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7d9998d569-7mv27" podStartSLOduration=2.000920023 podStartE2EDuration="3.794555299s" podCreationTimestamp="2025-10-06 16:34:17 +0000 UTC" firstStartedPulling="2025-10-06 16:34:18.144078667 +0000 UTC m=+6055.299371179" lastFinishedPulling="2025-10-06 16:34:19.937713943 +0000 UTC m=+6057.093006455" observedRunningTime="2025-10-06 16:34:20.779701516 +0000 UTC m=+6057.934994038" watchObservedRunningTime="2025-10-06 16:34:20.794555299 +0000 UTC m=+6057.949847801" Oct 06 16:34:28 crc kubenswrapper[4763]: I1006 16:34:28.521812 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 06 16:34:28 crc kubenswrapper[4763]: I1006 16:34:28.574829 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:34:28 crc kubenswrapper[4763]: E1006 16:34:28.575086 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:34:28 crc kubenswrapper[4763]: I1006 16:34:28.937109 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7d9998d569-7mv27" Oct 06 16:34:29 crc kubenswrapper[4763]: I1006 16:34:29.033651 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-65864c57c4-kcpk5" Oct 06 16:34:31 crc kubenswrapper[4763]: I1006 16:34:31.751208 4763 scope.go:117] "RemoveContainer" containerID="a7d15db469d172b6ba5d469e63d2504978cb1d0e40b57b9712c2a47b0b54b2bb" Oct 06 16:34:31 crc kubenswrapper[4763]: I1006 16:34:31.791435 4763 scope.go:117] "RemoveContainer" containerID="9b7a16253acbc46f0bad8196bab9cb8208fb7a6a90c91cdb5b6489255f739db5" Oct 06 16:34:31 crc kubenswrapper[4763]: I1006 16:34:31.830846 4763 scope.go:117] "RemoveContainer" 
containerID="e4e808278d1d9bd74d61ccbced436c2a8868e7f6534c6ddd0253f1f07c6861aa" Oct 06 16:34:37 crc kubenswrapper[4763]: I1006 16:34:37.482595 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7fb6f64cd9-654qc" Oct 06 16:34:38 crc kubenswrapper[4763]: I1006 16:34:38.520964 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c9cf99f6f-v9tb6" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 06 16:34:38 crc kubenswrapper[4763]: I1006 16:34:38.521371 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.959227 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.995248 4763 generic.go:334] "Generic (PLEG): container finished" podID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerID="839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5" exitCode=137 Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.995289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerDied","Data":"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5"} Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.995311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9cf99f6f-v9tb6" event={"ID":"ed85b17e-a95b-48e1-9cb3-afec958e3ecd","Type":"ContainerDied","Data":"7bb1cb7a0e1459c528a2417315f2e26d3223b22ae5c9392b6bdeb6992701bb18"} Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.995327 4763 scope.go:117] "RemoveContainer" containerID="f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8" Oct 06 16:34:40 crc kubenswrapper[4763]: I1006 16:34:40.995448 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c9cf99f6f-v9tb6" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.066035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data\") pod \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.067143 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2qc4\" (UniqueName: \"kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4\") pod \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.067226 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts\") pod \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.067303 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs\") pod \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.067425 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key\") pod \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\" (UID: \"ed85b17e-a95b-48e1-9cb3-afec958e3ecd\") " Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.068299 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs" (OuterVolumeSpecName: "logs") pod "ed85b17e-a95b-48e1-9cb3-afec958e3ecd" (UID: "ed85b17e-a95b-48e1-9cb3-afec958e3ecd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.072797 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ed85b17e-a95b-48e1-9cb3-afec958e3ecd" (UID: "ed85b17e-a95b-48e1-9cb3-afec958e3ecd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.072963 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4" (OuterVolumeSpecName: "kube-api-access-n2qc4") pod "ed85b17e-a95b-48e1-9cb3-afec958e3ecd" (UID: "ed85b17e-a95b-48e1-9cb3-afec958e3ecd"). InnerVolumeSpecName "kube-api-access-n2qc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.093509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data" (OuterVolumeSpecName: "config-data") pod "ed85b17e-a95b-48e1-9cb3-afec958e3ecd" (UID: "ed85b17e-a95b-48e1-9cb3-afec958e3ecd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.103466 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts" (OuterVolumeSpecName: "scripts") pod "ed85b17e-a95b-48e1-9cb3-afec958e3ecd" (UID: "ed85b17e-a95b-48e1-9cb3-afec958e3ecd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.161920 4763 scope.go:117] "RemoveContainer" containerID="839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.169944 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.169974 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.169985 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2qc4\" (UniqueName: \"kubernetes.io/projected/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-kube-api-access-n2qc4\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.169997 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.170009 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed85b17e-a95b-48e1-9cb3-afec958e3ecd-logs\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.188760 4763 scope.go:117] "RemoveContainer" containerID="f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8" Oct 06 16:34:41 crc kubenswrapper[4763]: E1006 16:34:41.189322 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8\": container with ID starting with f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8 not found: ID does not exist" containerID="f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.189368 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8"} err="failed to get container status \"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8\": rpc error: code = NotFound desc = could not find container \"f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8\": container with ID starting with f588fb773fb16ad68fa210fe70097d7f9663304d8d8bc53926f4b22e752b02e8 not found: ID does not exist" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.189396 4763 scope.go:117] "RemoveContainer" containerID="839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5" Oct 06 16:34:41 crc kubenswrapper[4763]: E1006 16:34:41.189804 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5\": container with ID starting with 839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5 not found: ID does not exist" containerID="839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.189846 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5"} err="failed to get container status \"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5\": rpc error: code = NotFound desc = could not find container \"839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5\": container with ID starting with 839009cd64e479faa228ef8ec34ef8ced9c420f5f46bee9b583edc9ac0fee4c5 not found: ID does not exist" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.341253 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.348054 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c9cf99f6f-v9tb6"] Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.575672 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:34:41 crc kubenswrapper[4763]: E1006 16:34:41.575949 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:34:41 crc kubenswrapper[4763]: I1006 16:34:41.585655 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" path="/var/lib/kubelet/pods/ed85b17e-a95b-48e1-9cb3-afec958e3ecd/volumes" Oct 06 16:34:45 crc kubenswrapper[4763]: I1006 16:34:45.054294 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vmffx"] Oct 06 16:34:45 crc kubenswrapper[4763]: I1006 16:34:45.074588 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vmffx"] Oct 06 16:34:45 crc kubenswrapper[4763]: I1006 16:34:45.590093 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774c046f-c1cb-440b-9e1e-3515d4d41738" path="/var/lib/kubelet/pods/774c046f-c1cb-440b-9e1e-3515d4d41738/volumes" Oct 06 16:34:46 crc kubenswrapper[4763]: I1006 16:34:46.030380 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fz6gw"] Oct 06 16:34:46 crc kubenswrapper[4763]: I1006 16:34:46.041480 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fz6gw"] Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.019267 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg"] Oct 06 16:34:47 crc kubenswrapper[4763]: E1006 16:34:47.021726 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.021895 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" Oct 06 16:34:47 crc kubenswrapper[4763]: E1006 16:34:47.022065 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon-log" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.022176 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon-log" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.022724 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon-log" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.022924 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed85b17e-a95b-48e1-9cb3-afec958e3ecd" containerName="horizon" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.026281 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.033049 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.037193 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7f98c"] Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.051569 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7f98c"] Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.065593 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg"] Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.102206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4nd\" (UniqueName: \"kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.102405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.102463 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.204157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle\") pod 
\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.204212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.204323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4nd\" (UniqueName: \"kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.204650 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.204934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.234971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4nd\" (UniqueName: \"kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.355438 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.594306 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bdf8ee-5273-48e3-9c11-bd14bb06998a" path="/var/lib/kubelet/pods/33bdf8ee-5273-48e3-9c11-bd14bb06998a/volumes" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.596070 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b733de-b169-483e-b598-9767d7a7f76d" path="/var/lib/kubelet/pods/49b733de-b169-483e-b598-9767d7a7f76d/volumes" Oct 06 16:34:47 crc kubenswrapper[4763]: I1006 16:34:47.811070 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg"] Oct 06 16:34:48 crc kubenswrapper[4763]: I1006 16:34:48.070856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerStarted","Data":"667b0fa32acd81df6e4222e05477ddf8a9ed6b2d0250b13b9af1d48eadf5472b"} Oct 06 16:34:48 crc kubenswrapper[4763]: I1006 16:34:48.070910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerStarted","Data":"48a2ec9412d54490ba146b904a79e34915c641cfb12e4f764b46999430bd2732"} Oct 06 16:34:49 crc kubenswrapper[4763]: I1006 16:34:49.101148 4763 generic.go:334] "Generic (PLEG): container finished" podID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerID="667b0fa32acd81df6e4222e05477ddf8a9ed6b2d0250b13b9af1d48eadf5472b" exitCode=0 Oct 06 16:34:49 crc kubenswrapper[4763]: I1006 16:34:49.101543 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerDied","Data":"667b0fa32acd81df6e4222e05477ddf8a9ed6b2d0250b13b9af1d48eadf5472b"} Oct 06 16:34:54 crc kubenswrapper[4763]: I1006 16:34:54.575228 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:34:54 crc kubenswrapper[4763]: E1006 16:34:54.576113 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:34:55 crc kubenswrapper[4763]: I1006 16:34:55.184896 4763 generic.go:334] "Generic (PLEG): container finished" podID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerID="ff4c95f8313b7f63f4a83ced8ed9cb5ec8ac261c8278d008345ff6c0397cd384" exitCode=0 Oct 06 16:34:55 crc kubenswrapper[4763]: I1006 16:34:55.184971 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerDied","Data":"ff4c95f8313b7f63f4a83ced8ed9cb5ec8ac261c8278d008345ff6c0397cd384"} Oct 06 16:34:56 crc kubenswrapper[4763]: I1006 16:34:56.197239 4763 generic.go:334] "Generic (PLEG): 
container finished" podID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerID="f0af8d21ba993fc604105617a133efd120b3c4c9ba94605b333af856c949a81b" exitCode=0 Oct 06 16:34:56 crc kubenswrapper[4763]: I1006 16:34:56.197286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerDied","Data":"f0af8d21ba993fc604105617a133efd120b3c4c9ba94605b333af856c949a81b"} Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.033596 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2816-account-create-mblr6"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.046523 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-89be-account-create-j47ck"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.063985 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2816-account-create-mblr6"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.077519 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d6c8-account-create-5rvxp"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.085973 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d6c8-account-create-5rvxp"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.093011 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-89be-account-create-j47ck"] Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.538157 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.599288 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379142c1-b420-458d-bc29-1102ef3ff096" path="/var/lib/kubelet/pods/379142c1-b420-458d-bc29-1102ef3ff096/volumes" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.600808 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f21252-6e8d-4f97-a3fa-c3f18afee924" path="/var/lib/kubelet/pods/98f21252-6e8d-4f97-a3fa-c3f18afee924/volumes" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.602074 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3f3df5-ca95-4209-abc7-84150d8acb0f" path="/var/lib/kubelet/pods/ba3f3df5-ca95-4209-abc7-84150d8acb0f/volumes" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.636723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util\") pod \"a27c468f-169a-43cb-8cb4-9265e9d62b63\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.636863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r4nd\" (UniqueName: \"kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd\") pod \"a27c468f-169a-43cb-8cb4-9265e9d62b63\" (UID: \"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.636954 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle\") pod \"a27c468f-169a-43cb-8cb4-9265e9d62b63\" (UID: 
\"a27c468f-169a-43cb-8cb4-9265e9d62b63\") " Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.639370 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle" (OuterVolumeSpecName: "bundle") pod "a27c468f-169a-43cb-8cb4-9265e9d62b63" (UID: "a27c468f-169a-43cb-8cb4-9265e9d62b63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.642974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd" (OuterVolumeSpecName: "kube-api-access-7r4nd") pod "a27c468f-169a-43cb-8cb4-9265e9d62b63" (UID: "a27c468f-169a-43cb-8cb4-9265e9d62b63"). InnerVolumeSpecName "kube-api-access-7r4nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.653209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util" (OuterVolumeSpecName: "util") pod "a27c468f-169a-43cb-8cb4-9265e9d62b63" (UID: "a27c468f-169a-43cb-8cb4-9265e9d62b63"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.741730 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-util\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.741759 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r4nd\" (UniqueName: \"kubernetes.io/projected/a27c468f-169a-43cb-8cb4-9265e9d62b63-kube-api-access-7r4nd\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:57 crc kubenswrapper[4763]: I1006 16:34:57.741772 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a27c468f-169a-43cb-8cb4-9265e9d62b63-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:34:58 crc kubenswrapper[4763]: I1006 16:34:58.224935 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" event={"ID":"a27c468f-169a-43cb-8cb4-9265e9d62b63","Type":"ContainerDied","Data":"48a2ec9412d54490ba146b904a79e34915c641cfb12e4f764b46999430bd2732"} Oct 06 16:34:58 crc kubenswrapper[4763]: I1006 16:34:58.224988 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a2ec9412d54490ba146b904a79e34915c641cfb12e4f764b46999430bd2732" Oct 06 16:34:58 crc kubenswrapper[4763]: I1006 16:34:58.225010 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg" Oct 06 16:35:05 crc kubenswrapper[4763]: I1006 16:35:05.068182 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq6fm"] Oct 06 16:35:05 crc kubenswrapper[4763]: I1006 16:35:05.078995 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq6fm"] Oct 06 16:35:05 crc kubenswrapper[4763]: I1006 16:35:05.588488 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace" path="/var/lib/kubelet/pods/1eda57bb-e3ff-4c43-bc1b-f0bfdc748ace/volumes" Oct 06 16:35:06 crc kubenswrapper[4763]: I1006 16:35:06.574783 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:35:06 crc kubenswrapper[4763]: E1006 16:35:06.575340 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.252083 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68"] Oct 06 16:35:11 crc kubenswrapper[4763]: E1006 16:35:11.252695 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="util" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.252706 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="util" Oct 06 16:35:11 crc kubenswrapper[4763]: E1006 16:35:11.252720 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="extract" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.252726 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="extract" Oct 06 16:35:11 crc kubenswrapper[4763]: E1006 16:35:11.252746 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="pull" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.252752 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="pull" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.252935 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27c468f-169a-43cb-8cb4-9265e9d62b63" containerName="extract" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.253645 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.259977 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.278820 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-gpnsd" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.279125 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.313546 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.353001 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vbvx\" (UniqueName: \"kubernetes.io/projected/10de2cae-960d-44d9-aa93-1daddae117ed-kube-api-access-5vbvx\") pod \"obo-prometheus-operator-7c8cf85677-p7j68\" (UID: \"10de2cae-960d-44d9-aa93-1daddae117ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.381732 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.384063 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.386487 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9sfgd" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.386813 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.397059 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.398992 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.432909 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.454833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.454926 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.454988 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vbvx\" (UniqueName: \"kubernetes.io/projected/10de2cae-960d-44d9-aa93-1daddae117ed-kube-api-access-5vbvx\") pod \"obo-prometheus-operator-7c8cf85677-p7j68\" (UID: \"10de2cae-960d-44d9-aa93-1daddae117ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.455046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.455075 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.466328 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.494956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vbvx\" (UniqueName: \"kubernetes.io/projected/10de2cae-960d-44d9-aa93-1daddae117ed-kube-api-access-5vbvx\") pod \"obo-prometheus-operator-7c8cf85677-p7j68\" (UID: \"10de2cae-960d-44d9-aa93-1daddae117ed\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.503806 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mhpbv"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.505178 4763 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.521840 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-64vxk" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.522014 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.528907 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mhpbv"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.556998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ac9613-496b-40ce-b728-8ba688e8333c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.557057 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.557367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.557448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.557597 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.557702 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfmc\" (UniqueName: \"kubernetes.io/projected/b2ac9613-496b-40ce-b728-8ba688e8333c-kube-api-access-wlfmc\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.563626 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.572145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.572487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ae1ce1b-34cc-4085-b248-4e17a14b098b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg\" (UID: \"8ae1ce1b-34cc-4085-b248-4e17a14b098b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.581314 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a462d496-370d-4f61-adf3-ce12c97ef3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6\" (UID: \"a462d496-370d-4f61-adf3-ce12c97ef3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.593907 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.659349 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfmc\" (UniqueName: \"kubernetes.io/projected/b2ac9613-496b-40ce-b728-8ba688e8333c-kube-api-access-wlfmc\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.659418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ac9613-496b-40ce-b728-8ba688e8333c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.664107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ac9613-496b-40ce-b728-8ba688e8333c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.669763 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-7sql4"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.671347 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.675670 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-wtqjv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.684367 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-7sql4"] Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.694050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfmc\" (UniqueName: \"kubernetes.io/projected/b2ac9613-496b-40ce-b728-8ba688e8333c-kube-api-access-wlfmc\") pod \"observability-operator-cc5f78dfc-mhpbv\" (UID: \"b2ac9613-496b-40ce-b728-8ba688e8333c\") " pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.736424 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.748778 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.754259 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.763294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd776ba-98eb-4204-939c-7ac1f4cd216a-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: \"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.763526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnfs\" (UniqueName: \"kubernetes.io/projected/ecd776ba-98eb-4204-939c-7ac1f4cd216a-kube-api-access-nbnfs\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: \"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.865419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnfs\" (UniqueName: \"kubernetes.io/projected/ecd776ba-98eb-4204-939c-7ac1f4cd216a-kube-api-access-nbnfs\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: \"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.865892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd776ba-98eb-4204-939c-7ac1f4cd216a-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: \"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.866743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd776ba-98eb-4204-939c-7ac1f4cd216a-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: 
\"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:11 crc kubenswrapper[4763]: I1006 16:35:11.884093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnfs\" (UniqueName: \"kubernetes.io/projected/ecd776ba-98eb-4204-939c-7ac1f4cd216a-kube-api-access-nbnfs\") pod \"perses-operator-54bc95c9fb-7sql4\" (UID: \"ecd776ba-98eb-4204-939c-7ac1f4cd216a\") " pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.066127 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.114940 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68"] Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.316069 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-mhpbv"] Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.416986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" event={"ID":"b2ac9613-496b-40ce-b728-8ba688e8333c","Type":"ContainerStarted","Data":"029efac6641e04da9534815d94971730d358ff4757dce81653770a91b72b0066"} Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.418507 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" event={"ID":"10de2cae-960d-44d9-aa93-1daddae117ed","Type":"ContainerStarted","Data":"a3def1c0248e228424d7466b4907e1faf5fc710f1a93297d023439062b655d72"} Oct 06 16:35:12 crc kubenswrapper[4763]: W1006 16:35:12.461913 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae1ce1b_34cc_4085_b248_4e17a14b098b.slice/crio-6fbf9f440b18fa97ab7f16f5675020c5674914ef98e765d34172b6ee23538175 WatchSource:0}: Error finding container 6fbf9f440b18fa97ab7f16f5675020c5674914ef98e765d34172b6ee23538175: Status 404 returned error can't find the container with id 6fbf9f440b18fa97ab7f16f5675020c5674914ef98e765d34172b6ee23538175 Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.472136 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg"] Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.485911 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6"] Oct 06 16:35:12 crc kubenswrapper[4763]: I1006 16:35:12.621734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-7sql4"] Oct 06 16:35:12 crc kubenswrapper[4763]: W1006 16:35:12.634951 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd776ba_98eb_4204_939c_7ac1f4cd216a.slice/crio-f50eb9aa2cf7cb7ce8f3d5f1401034059f52cb3b22e247772e0e43532f88d42d WatchSource:0}: Error finding container f50eb9aa2cf7cb7ce8f3d5f1401034059f52cb3b22e247772e0e43532f88d42d: Status 404 returned error can't find the container with id f50eb9aa2cf7cb7ce8f3d5f1401034059f52cb3b22e247772e0e43532f88d42d Oct 06 16:35:13 crc kubenswrapper[4763]: I1006 16:35:13.434938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" event={"ID":"ecd776ba-98eb-4204-939c-7ac1f4cd216a","Type":"ContainerStarted","Data":"f50eb9aa2cf7cb7ce8f3d5f1401034059f52cb3b22e247772e0e43532f88d42d"} Oct 06 16:35:13 crc kubenswrapper[4763]: I1006 16:35:13.437301 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" event={"ID":"a462d496-370d-4f61-adf3-ce12c97ef3be","Type":"ContainerStarted","Data":"d9ee8dabec748078e56b0bea082ceb17c0c3a6e34b16b979fff32b9ed7925646"} Oct 06 16:35:13 crc kubenswrapper[4763]: I1006 16:35:13.439278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" event={"ID":"8ae1ce1b-34cc-4085-b248-4e17a14b098b","Type":"ContainerStarted","Data":"6fbf9f440b18fa97ab7f16f5675020c5674914ef98e765d34172b6ee23538175"} Oct 06 16:35:17 crc kubenswrapper[4763]: I1006 16:35:17.575062 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:35:17 crc kubenswrapper[4763]: E1006 16:35:17.575928 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.525821 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" event={"ID":"10de2cae-960d-44d9-aa93-1daddae117ed","Type":"ContainerStarted","Data":"dca47d7cb1c7c1408dc7a7ef5d7123c8bbb7aa045fff76adc420909b793aade9"} Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.528876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" event={"ID":"ecd776ba-98eb-4204-939c-7ac1f4cd216a","Type":"ContainerStarted","Data":"1a6e133d7b42fee4af91101a6ad4226b31e8761863fcd79020f180480d832156"} Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.529805 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.531866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" event={"ID":"b2ac9613-496b-40ce-b728-8ba688e8333c","Type":"ContainerStarted","Data":"02e99c9185a7392c0044b9e512bdd89fc80e923e039d1d3a6dcd6cc3a14dacaa"} Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.533569 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.534543 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" event={"ID":"a462d496-370d-4f61-adf3-ce12c97ef3be","Type":"ContainerStarted","Data":"35bced495239ba36f992bbd5488bb0a2731b11af1b5ac293ec09be8695ddfa49"} Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.536498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" 
event={"ID":"8ae1ce1b-34cc-4085-b248-4e17a14b098b","Type":"ContainerStarted","Data":"183fac060fc9d7a2fbbe38cdb645d8662d4587ffc7f9e0fabaa5920e325ea294"} Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.548819 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-p7j68" podStartSLOduration=2.3354107170000002 podStartE2EDuration="8.548804899s" podCreationTimestamp="2025-10-06 16:35:11 +0000 UTC" firstStartedPulling="2025-10-06 16:35:12.159528674 +0000 UTC m=+6109.314821186" lastFinishedPulling="2025-10-06 16:35:18.372922816 +0000 UTC m=+6115.528215368" observedRunningTime="2025-10-06 16:35:19.545204002 +0000 UTC m=+6116.700496514" watchObservedRunningTime="2025-10-06 16:35:19.548804899 +0000 UTC m=+6116.704097411" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.564365 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.585090 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6" podStartSLOduration=2.693186977 podStartE2EDuration="8.585070265s" podCreationTimestamp="2025-10-06 16:35:11 +0000 UTC" firstStartedPulling="2025-10-06 16:35:12.486473026 +0000 UTC m=+6109.641765538" lastFinishedPulling="2025-10-06 16:35:18.378356314 +0000 UTC m=+6115.533648826" observedRunningTime="2025-10-06 16:35:19.573991664 +0000 UTC m=+6116.729284186" watchObservedRunningTime="2025-10-06 16:35:19.585070265 +0000 UTC m=+6116.740362787" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.615548 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" podStartSLOduration=2.879161948 podStartE2EDuration="8.615525562s" podCreationTimestamp="2025-10-06 16:35:11 +0000 UTC" firstStartedPulling="2025-10-06 16:35:12.637376275 +0000 UTC m=+6109.792668807" lastFinishedPulling="2025-10-06 16:35:18.373739909 +0000 UTC m=+6115.529032421" observedRunningTime="2025-10-06 16:35:19.610440954 +0000 UTC m=+6116.765733476" watchObservedRunningTime="2025-10-06 16:35:19.615525562 +0000 UTC m=+6116.770818084" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.640913 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-mhpbv" podStartSLOduration=2.546277356 podStartE2EDuration="8.640895021s" podCreationTimestamp="2025-10-06 16:35:11 +0000 UTC" firstStartedPulling="2025-10-06 16:35:12.331820665 +0000 UTC m=+6109.487113177" lastFinishedPulling="2025-10-06 16:35:18.42643832 +0000 UTC m=+6115.581730842" observedRunningTime="2025-10-06 16:35:19.629290816 +0000 UTC m=+6116.784583338" watchObservedRunningTime="2025-10-06 16:35:19.640895021 +0000 UTC m=+6116.796187533" Oct 06 16:35:19 crc kubenswrapper[4763]: I1006 16:35:19.654315 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg" podStartSLOduration=2.752404145 podStartE2EDuration="8.654295065s" podCreationTimestamp="2025-10-06 16:35:11 +0000 UTC" firstStartedPulling="2025-10-06 16:35:12.466203705 +0000 UTC m=+6109.621496217" lastFinishedPulling="2025-10-06 16:35:18.368094625 +0000 UTC m=+6115.523387137" observedRunningTime="2025-10-06 16:35:19.648053476 +0000 UTC m=+6116.803345988" 
watchObservedRunningTime="2025-10-06 16:35:19.654295065 +0000 UTC m=+6116.809587577" Oct 06 16:35:20 crc kubenswrapper[4763]: I1006 16:35:20.101354 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-964hw"] Oct 06 16:35:20 crc kubenswrapper[4763]: I1006 16:35:20.146390 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ldxd4"] Oct 06 16:35:20 crc kubenswrapper[4763]: I1006 16:35:20.157506 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-964hw"] Oct 06 16:35:20 crc kubenswrapper[4763]: I1006 16:35:20.178341 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ldxd4"] Oct 06 16:35:21 crc kubenswrapper[4763]: I1006 16:35:21.592600 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d391b2-f609-4a0e-857a-5c64ab6168f5" path="/var/lib/kubelet/pods/19d391b2-f609-4a0e-857a-5c64ab6168f5/volumes" Oct 06 16:35:21 crc kubenswrapper[4763]: I1006 16:35:21.593601 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d34157a-16c1-44e4-b187-8b1294ac635b" path="/var/lib/kubelet/pods/2d34157a-16c1-44e4-b187-8b1294ac635b/volumes" Oct 06 16:35:31 crc kubenswrapper[4763]: I1006 16:35:31.574884 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:35:31 crc kubenswrapper[4763]: E1006 16:35:31.575715 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:35:31 crc kubenswrapper[4763]: I1006 16:35:31.980865 4763 scope.go:117] "RemoveContainer" containerID="805c82b71bc72f8ec1d422d3b63b4fdfd281515aa4f3d5d9cf69fb0ce2bd2e10" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.036198 4763 scope.go:117] "RemoveContainer" containerID="f76aa74d43b76d76c38989ea818a0dd1788cb57cfa903bc157b89141b9c66786" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.069724 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-7sql4" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.120929 4763 scope.go:117] "RemoveContainer" containerID="7c01df081de2b96a4a52010e21a640be99247524173b40054f496c8248373aa0" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.187255 4763 scope.go:117] "RemoveContainer" containerID="f1f419461efdedbd202967e64c87778a6aa491c1ccc56a07ba0f8336180b15a5" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.210840 4763 scope.go:117] "RemoveContainer" containerID="d20fa5f94991af9a694579a7c1f5105f1fa8b7f35738e581b42e0420b69b5e37" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.272681 4763 scope.go:117] "RemoveContainer" containerID="1348fb122ebd3a676b1ba3eff24139d3621803eda8fe10fa277dd55f53ad0a56" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.295921 4763 scope.go:117] "RemoveContainer" containerID="96274dc5d4ab3bf3fd5e3b2f28ed2071a6a2afc83207c30295342e22f1b19bc8" Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.324201 4763 scope.go:117] "RemoveContainer" containerID="e124679b91b485012312872ea98351c7927f485535fdc5d4bbf79f325ccef9a7" 
Oct 06 16:35:32 crc kubenswrapper[4763]: I1006 16:35:32.346418 4763 scope.go:117] "RemoveContainer" containerID="5e5bc6ecc7df58d82d9bf9e17179fb87d491bbcfd54767575ca65d1ea3ee58a7" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.588025 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.599894 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.644597 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 16:35:34 crc kubenswrapper[4763]: E1006 16:35:34.645297 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" containerName="openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.645316 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" containerName="openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.645527 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" containerName="openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.646260 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.663347 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.703018 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" containerName="openstackclient" containerID="cri-o://a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad" gracePeriod=2 Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.709716 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f6c8b569-3102-4145-94db-b8150854dbc9" podUID="6b2385da-75ff-4a56-baf3-9632066140c6" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.744822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbsz\" (UniqueName: \"kubernetes.io/projected/6b2385da-75ff-4a56-baf3-9632066140c6-kube-api-access-hmbsz\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.744872 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.745261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.847033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.847128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbsz\" (UniqueName: \"kubernetes.io/projected/6b2385da-75ff-4a56-baf3-9632066140c6-kube-api-access-hmbsz\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.847160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.848097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.855826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b2385da-75ff-4a56-baf3-9632066140c6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.877379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbsz\" (UniqueName: \"kubernetes.io/projected/6b2385da-75ff-4a56-baf3-9632066140c6-kube-api-access-hmbsz\") pod \"openstackclient\" (UID: \"6b2385da-75ff-4a56-baf3-9632066140c6\") " pod="openstack/openstackclient" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.924801 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.926039 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.929974 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ss58n" Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.966137 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 16:35:34 crc kubenswrapper[4763]: I1006 16:35:34.978371 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.050487 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgqq\" (UniqueName: \"kubernetes.io/projected/7804f1ac-7960-4f1e-9b93-33b457147757-kube-api-access-pkgqq\") pod \"kube-state-metrics-0\" (UID: \"7804f1ac-7960-4f1e-9b93-33b457147757\") " pod="openstack/kube-state-metrics-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.160877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkgqq\" (UniqueName: \"kubernetes.io/projected/7804f1ac-7960-4f1e-9b93-33b457147757-kube-api-access-pkgqq\") pod \"kube-state-metrics-0\" (UID: \"7804f1ac-7960-4f1e-9b93-33b457147757\") " pod="openstack/kube-state-metrics-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.197499 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgqq\" (UniqueName: \"kubernetes.io/projected/7804f1ac-7960-4f1e-9b93-33b457147757-kube-api-access-pkgqq\") pod \"kube-state-metrics-0\" (UID: \"7804f1ac-7960-4f1e-9b93-33b457147757\") " pod="openstack/kube-state-metrics-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.328178 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.545704 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.548022 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.567250 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.567502 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.567689 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.568510 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-pjjqs" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.650390 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.678557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.683430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.683474 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.683529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.683672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.683781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fh5k\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-kube-api-access-8fh5k\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.786736 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.786817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fh5k\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-kube-api-access-8fh5k\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.786890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.786935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.786970 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 
crc kubenswrapper[4763]: I1006 16:35:35.786998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.787807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.797176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.797285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.797736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da31df66-c543-4d37-9499-4265ef5ad835-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.797789 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da31df66-c543-4d37-9499-4265ef5ad835-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.810048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fh5k\" (UniqueName: \"kubernetes.io/projected/da31df66-c543-4d37-9499-4265ef5ad835-kube-api-access-8fh5k\") pod \"alertmanager-metric-storage-0\" (UID: \"da31df66-c543-4d37-9499-4265ef5ad835\") " pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:35 crc kubenswrapper[4763]: I1006 16:35:35.925101 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.038554 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 16:35:36 crc kubenswrapper[4763]: W1006 16:35:36.068155 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2385da_75ff_4a56_baf3_9632066140c6.slice/crio-9fd51351c516d564b9ea66d8d81dcda9f6d7b179a2538ac97b72268cce1e0e13 WatchSource:0}: Error finding container 9fd51351c516d564b9ea66d8d81dcda9f6d7b179a2538ac97b72268cce1e0e13: Status 404 returned error can't find the container with id 9fd51351c516d564b9ea66d8d81dcda9f6d7b179a2538ac97b72268cce1e0e13 Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.192100 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.237664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.237770 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.253785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-n99w4" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.254166 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.254273 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.268838 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.269053 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.277677 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317774 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-878a6def-c1e2-4066-8e18-135b96d9543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-878a6def-c1e2-4066-8e18-135b96d9543c\") pod 
\"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317885 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bb77f67f-1b78-4be2-be5b-bae817e4cf46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.317940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.318038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwpg\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-kube-api-access-2wwpg\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.318070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.387881 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.419843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwpg\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-kube-api-access-2wwpg\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.419919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.420051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc 
kubenswrapper[4763]: I1006 16:35:36.420082 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.420127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-878a6def-c1e2-4066-8e18-135b96d9543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-878a6def-c1e2-4066-8e18-135b96d9543c\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.420158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bb77f67f-1b78-4be2-be5b-bae817e4cf46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.420195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.420239 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.424538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bb77f67f-1b78-4be2-be5b-bae817e4cf46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.426578 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.426609 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-878a6def-c1e2-4066-8e18-135b96d9543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-878a6def-c1e2-4066-8e18-135b96d9543c\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a13c3ae8c4f164e694447d2d7b07df36783613e16a095979631d28ace011afe/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.428972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.431651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.447046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.457041 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwpg\" (UniqueName: \"kubernetes.io/projected/bb77f67f-1b78-4be2-be5b-bae817e4cf46-kube-api-access-2wwpg\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.464347 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.479310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb77f67f-1b78-4be2-be5b-bae817e4cf46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.642273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-878a6def-c1e2-4066-8e18-135b96d9543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-878a6def-c1e2-4066-8e18-135b96d9543c\") pod \"prometheus-metric-storage-0\" (UID: \"bb77f67f-1b78-4be2-be5b-bae817e4cf46\") " pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.744434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"6b2385da-75ff-4a56-baf3-9632066140c6","Type":"ContainerStarted","Data":"9fd51351c516d564b9ea66d8d81dcda9f6d7b179a2538ac97b72268cce1e0e13"} Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.766755 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7804f1ac-7960-4f1e-9b93-33b457147757","Type":"ContainerStarted","Data":"6fc02a641cf678d569cf7e7bec85d98f37311f93835d1597efc0d340f283d6e2"} Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.926167 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 16:35:36 crc kubenswrapper[4763]: I1006 16:35:36.951859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.246089 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.278101 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config\") pod \"f6c8b569-3102-4145-94db-b8150854dbc9\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.278144 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret\") pod \"f6c8b569-3102-4145-94db-b8150854dbc9\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.279163 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xld87\" (UniqueName: \"kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87\") pod \"f6c8b569-3102-4145-94db-b8150854dbc9\" (UID: \"f6c8b569-3102-4145-94db-b8150854dbc9\") " Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.305212 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87" (OuterVolumeSpecName: "kube-api-access-xld87") pod "f6c8b569-3102-4145-94db-b8150854dbc9" (UID: "f6c8b569-3102-4145-94db-b8150854dbc9"). InnerVolumeSpecName "kube-api-access-xld87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.366286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f6c8b569-3102-4145-94db-b8150854dbc9" (UID: "f6c8b569-3102-4145-94db-b8150854dbc9"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.384093 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.384124 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xld87\" (UniqueName: \"kubernetes.io/projected/f6c8b569-3102-4145-94db-b8150854dbc9-kube-api-access-xld87\") on node \"crc\" DevicePath \"\"" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.408763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f6c8b569-3102-4145-94db-b8150854dbc9" (UID: "f6c8b569-3102-4145-94db-b8150854dbc9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.486482 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f6c8b569-3102-4145-94db-b8150854dbc9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.589662 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c8b569-3102-4145-94db-b8150854dbc9" path="/var/lib/kubelet/pods/f6c8b569-3102-4145-94db-b8150854dbc9/volumes" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.595559 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.774604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da31df66-c543-4d37-9499-4265ef5ad835","Type":"ContainerStarted","Data":"c6078f60c8887de65791fc0c31811f1fcf2fba8364e0e2405445c465a3aec3dd"} Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.775963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7804f1ac-7960-4f1e-9b93-33b457147757","Type":"ContainerStarted","Data":"9ae260af5d9ea4b67de7e0906e88d5aa3b5228300f3b2d5a3d99c8811742ffbd"} Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.777019 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.778704 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6c8b569-3102-4145-94db-b8150854dbc9" containerID="a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad" exitCode=137 Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.778760 4763 scope.go:117] "RemoveContainer" containerID="a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.778775 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.784020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6b2385da-75ff-4a56-baf3-9632066140c6","Type":"ContainerStarted","Data":"fff7fd38afa3c2fbea8ae43f56769224a727a45eab1d6456aedf874b8d043cc1"} Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.785394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerStarted","Data":"830d1bd516903f95b9acc286ac3a7e3d75bddd44d7750801f369a8eac19b7a30"} Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.796994 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.311213279 podStartE2EDuration="3.796976134s" podCreationTimestamp="2025-10-06 16:35:34 +0000 UTC" firstStartedPulling="2025-10-06 16:35:36.449401777 +0000 UTC m=+6133.604694289" lastFinishedPulling="2025-10-06 16:35:36.935164632 +0000 UTC m=+6134.090457144" observedRunningTime="2025-10-06 16:35:37.789146112 +0000 UTC m=+6134.944438624" watchObservedRunningTime="2025-10-06 16:35:37.796976134 +0000 UTC m=+6134.952268646" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.814813 4763 scope.go:117] "RemoveContainer" containerID="a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.815190 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.815173489 podStartE2EDuration="3.815173489s" podCreationTimestamp="2025-10-06 16:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:35:37.805414564 +0000 UTC m=+6134.960707086" watchObservedRunningTime="2025-10-06 16:35:37.815173489 +0000 UTC m=+6134.970466001" Oct 06 16:35:37 crc kubenswrapper[4763]: E1006 16:35:37.815291 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad\": container with ID starting with a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad not found: ID does not exist" containerID="a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad" Oct 06 16:35:37 crc kubenswrapper[4763]: I1006 16:35:37.815343 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad"} err="failed to get container status \"a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad\": rpc error: code = NotFound desc = could not find container \"a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad\": container with ID starting with a893e0b4ddab8afaa478afbe40b1ac9c36646297619e7002d6de8a2787138aad not found: ID does not exist" Oct 06 16:35:39 crc kubenswrapper[4763]: I1006 16:35:39.030432 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6xhjm"] Oct 06 16:35:39 crc kubenswrapper[4763]: I1006 16:35:39.039269 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6xhjm"] Oct 06 16:35:39 crc kubenswrapper[4763]: I1006 16:35:39.589806 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d18b8373-b3de-41e7-96c7-958f69e01594" path="/var/lib/kubelet/pods/d18b8373-b3de-41e7-96c7-958f69e01594/volumes" Oct 06 16:35:43 crc kubenswrapper[4763]: I1006 16:35:43.583822 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:35:43 crc kubenswrapper[4763]: E1006 16:35:43.584579 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:35:43 crc kubenswrapper[4763]: I1006 16:35:43.848153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerStarted","Data":"1b6ff8067708065fad43204e7895184f8956f9340cc9824725393a0edc59275d"} Oct 06 16:35:43 crc kubenswrapper[4763]: I1006 16:35:43.849761 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da31df66-c543-4d37-9499-4265ef5ad835","Type":"ContainerStarted","Data":"1dfeab3ebcc3eee888bb82465bde2563e191479f3118043a58f0b9738bde9c8d"} Oct 06 16:35:45 crc kubenswrapper[4763]: I1006 16:35:45.336591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 16:35:52 crc kubenswrapper[4763]: I1006 16:35:52.956595 4763 generic.go:334] "Generic (PLEG): container finished" podID="da31df66-c543-4d37-9499-4265ef5ad835" containerID="1dfeab3ebcc3eee888bb82465bde2563e191479f3118043a58f0b9738bde9c8d" exitCode=0 Oct 06 16:35:52 crc kubenswrapper[4763]: I1006 16:35:52.956724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da31df66-c543-4d37-9499-4265ef5ad835","Type":"ContainerDied","Data":"1dfeab3ebcc3eee888bb82465bde2563e191479f3118043a58f0b9738bde9c8d"} Oct 06 16:35:53 crc kubenswrapper[4763]: I1006 16:35:53.969071 4763 generic.go:334] "Generic (PLEG): container finished" podID="bb77f67f-1b78-4be2-be5b-bae817e4cf46" containerID="1b6ff8067708065fad43204e7895184f8956f9340cc9824725393a0edc59275d" exitCode=0 Oct 06 16:35:53 crc kubenswrapper[4763]: I1006 16:35:53.969156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerDied","Data":"1b6ff8067708065fad43204e7895184f8956f9340cc9824725393a0edc59275d"} Oct 06 16:35:55 crc kubenswrapper[4763]: I1006 16:35:55.575358 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:35:55 crc kubenswrapper[4763]: E1006 16:35:55.576049 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:35:55 crc kubenswrapper[4763]: I1006 16:35:55.993385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"da31df66-c543-4d37-9499-4265ef5ad835","Type":"ContainerStarted","Data":"101ca771649b2019b0925a7ab58bba75eb79c1a4e78b9b48c7d82371dd303f37"} Oct 06 16:36:00 crc kubenswrapper[4763]: I1006 16:36:00.036476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da31df66-c543-4d37-9499-4265ef5ad835","Type":"ContainerStarted","Data":"84c1ec1b3553db83ea0b3a523be73c842a9e4b237d8042678bd46a34146f61dd"} Oct 06 16:36:00 crc kubenswrapper[4763]: I1006 16:36:00.038216 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 06 16:36:00 crc kubenswrapper[4763]: I1006 16:36:00.039060 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 06 16:36:00 crc kubenswrapper[4763]: I1006 16:36:00.106150 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.827047484 podStartE2EDuration="25.106128329s" podCreationTimestamp="2025-10-06 16:35:35 +0000 UTC" firstStartedPulling="2025-10-06 16:35:36.937709061 +0000 UTC m=+6134.093001573" lastFinishedPulling="2025-10-06 16:35:55.216789876 +0000 UTC m=+6152.372082418" observedRunningTime="2025-10-06 16:36:00.075521847 +0000 UTC m=+6157.230814359" watchObservedRunningTime="2025-10-06 16:36:00.106128329 +0000 UTC m=+6157.261420841" Oct 06 16:36:01 crc kubenswrapper[4763]: I1006 16:36:01.053031 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerStarted","Data":"89a3984c80744042a5b03fdfbb46150f09194e58222315fabd2954db15ee37c7"} Oct 06 16:36:04 crc kubenswrapper[4763]: I1006 16:36:04.103884 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerStarted","Data":"ae846eff3b73d99570cb2c477b3fcb7cf54180b4e714b56f41e0a19968386a82"} Oct 06 16:36:07 crc kubenswrapper[4763]: I1006 16:36:07.149897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bb77f67f-1b78-4be2-be5b-bae817e4cf46","Type":"ContainerStarted","Data":"67b9e34c255d499e70bc8f33510527d02bc8d65e5b05543ba6e98f306831ecad"} Oct 06 16:36:07 crc kubenswrapper[4763]: I1006 16:36:07.191526 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.241554612 podStartE2EDuration="32.191479727s" podCreationTimestamp="2025-10-06 16:35:35 +0000 UTC" firstStartedPulling="2025-10-06 16:35:37.599662114 +0000 UTC m=+6134.754954626" lastFinishedPulling="2025-10-06 16:36:06.549587229 +0000 UTC m=+6163.704879741" observedRunningTime="2025-10-06 16:36:07.182856123 +0000 UTC m=+6164.338148665" watchObservedRunningTime="2025-10-06 16:36:07.191479727 +0000 UTC m=+6164.346772279" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.517416 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.520748 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.523829 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.524437 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.540909 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.575296 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:36:10 crc kubenswrapper[4763]: E1006 16:36:10.575632 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.677976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678039 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.678259 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgfh\" (UniqueName: \"kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.780331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.780445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.780527 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.780799 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgfh\" (UniqueName: \"kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.780990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.781085 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.781212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.781687 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.781846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.788535 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.788653 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.792821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.801026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgfh\" (UniqueName: \"kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.803935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " pod="openstack/ceilometer-0" Oct 06 16:36:10 crc kubenswrapper[4763]: I1006 16:36:10.837907 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:11 crc kubenswrapper[4763]: I1006 16:36:11.413689 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:11 crc kubenswrapper[4763]: W1006 16:36:11.418014 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ee4c6f_66b1_43e8_96df_1a5b44b5261a.slice/crio-5cecddae9b5d4ee959a3cefc0a1949ebb1fb5479d99a859329cf643b867ec2ab WatchSource:0}: Error finding container 5cecddae9b5d4ee959a3cefc0a1949ebb1fb5479d99a859329cf643b867ec2ab: Status 404 returned error can't find the container with id 5cecddae9b5d4ee959a3cefc0a1949ebb1fb5479d99a859329cf643b867ec2ab Oct 06 16:36:11 crc kubenswrapper[4763]: I1006 16:36:11.927864 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 16:36:12 crc kubenswrapper[4763]: I1006 16:36:12.200947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerStarted","Data":"5cecddae9b5d4ee959a3cefc0a1949ebb1fb5479d99a859329cf643b867ec2ab"} Oct 06 16:36:13 crc kubenswrapper[4763]: I1006 16:36:13.214033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerStarted","Data":"67520a86211a566aeb011619d398a2fd16a506c7bf35fcd945ca7d8cc2257592"} Oct 06 16:36:13 crc kubenswrapper[4763]: I1006 16:36:13.214646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerStarted","Data":"33a6ce18e58449145b7703f3a4c6818fe484c7c0e48e266ab72a199cf0d84f7b"} Oct 06 16:36:14 crc kubenswrapper[4763]: I1006 16:36:14.238635 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerStarted","Data":"8318693360e4b98f757d4d551647d5cd57bfce3979ca33c03981d8b8699b4cc9"} Oct 06 16:36:16 crc kubenswrapper[4763]: I1006 16:36:16.263329 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerStarted","Data":"4dbe1db5ad85f72482a36fba1c1dad73d343fbf7dd6dce373209b1a60ef3113c"} Oct 06 16:36:16 crc kubenswrapper[4763]: I1006 16:36:16.265824 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 16:36:16 crc kubenswrapper[4763]: I1006 16:36:16.290340 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.230721611 podStartE2EDuration="6.290321223s" podCreationTimestamp="2025-10-06 16:36:10 +0000 UTC" firstStartedPulling="2025-10-06 16:36:11.421197701 +0000 UTC m=+6168.576490213" lastFinishedPulling="2025-10-06 16:36:15.480797313 +0000 UTC m=+6172.636089825" observedRunningTime="2025-10-06 16:36:16.289857151 +0000 UTC m=+6173.445149693" watchObservedRunningTime="2025-10-06 16:36:16.290321223 +0000 UTC m=+6173.445613735" Oct 06 16:36:18 crc kubenswrapper[4763]: I1006 16:36:18.718095 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-xqxl8"] Oct 06 16:36:18 crc kubenswrapper[4763]: I1006 16:36:18.719859 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:18 crc kubenswrapper[4763]: I1006 16:36:18.727476 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-xqxl8"] Oct 06 16:36:18 crc kubenswrapper[4763]: I1006 16:36:18.862364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27n8\" (UniqueName: \"kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8\") pod \"aodh-db-create-xqxl8\" (UID: \"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb\") " pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:18 crc kubenswrapper[4763]: I1006 16:36:18.964208 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p27n8\" (UniqueName: \"kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8\") pod \"aodh-db-create-xqxl8\" (UID: \"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb\") " pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:19 crc kubenswrapper[4763]: I1006 16:36:19.017361 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27n8\" (UniqueName: \"kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8\") pod \"aodh-db-create-xqxl8\" (UID: \"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb\") " pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:19 crc kubenswrapper[4763]: I1006 16:36:19.072049 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:19 crc kubenswrapper[4763]: I1006 16:36:19.709809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-xqxl8"] Oct 06 16:36:20 crc kubenswrapper[4763]: I1006 16:36:20.314089 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" containerID="2e00963f59a29db40a3659d85b5ad8336eea032bb8ff52b5603432223da99cb4" exitCode=0 Oct 06 16:36:20 crc kubenswrapper[4763]: I1006 16:36:20.314172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xqxl8" event={"ID":"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb","Type":"ContainerDied","Data":"2e00963f59a29db40a3659d85b5ad8336eea032bb8ff52b5603432223da99cb4"} Oct 06 16:36:20 crc kubenswrapper[4763]: I1006 16:36:20.315328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xqxl8" event={"ID":"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb","Type":"ContainerStarted","Data":"1362ecf34329ceae9aeed973874a3dab4e32b407fb6639631c6bf62d727c3785"} Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.733234 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.838859 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p27n8\" (UniqueName: \"kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8\") pod \"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb\" (UID: \"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb\") " Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.844268 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8" (OuterVolumeSpecName: "kube-api-access-p27n8") pod "c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" (UID: "c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb"). InnerVolumeSpecName "kube-api-access-p27n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.928455 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.930424 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 16:36:21 crc kubenswrapper[4763]: I1006 16:36:21.940901 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p27n8\" (UniqueName: \"kubernetes.io/projected/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb-kube-api-access-p27n8\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:22 crc kubenswrapper[4763]: I1006 16:36:22.335198 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xqxl8" Oct 06 16:36:22 crc kubenswrapper[4763]: I1006 16:36:22.335226 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xqxl8" event={"ID":"c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb","Type":"ContainerDied","Data":"1362ecf34329ceae9aeed973874a3dab4e32b407fb6639631c6bf62d727c3785"} Oct 06 16:36:22 crc kubenswrapper[4763]: I1006 16:36:22.335652 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1362ecf34329ceae9aeed973874a3dab4e32b407fb6639631c6bf62d727c3785" Oct 06 16:36:22 crc kubenswrapper[4763]: I1006 16:36:22.336708 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 06 16:36:23 crc kubenswrapper[4763]: I1006 16:36:23.029278 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tx54f"] Oct 06 16:36:23 crc kubenswrapper[4763]: I1006 16:36:23.036892 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tx54f"] Oct 06 16:36:23 crc kubenswrapper[4763]: I1006 16:36:23.591787 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16239947-411d-416c-be77-4b34624cfc2f" path="/var/lib/kubelet/pods/16239947-411d-416c-be77-4b34624cfc2f/volumes" Oct 06 16:36:24 crc kubenswrapper[4763]: I1006 16:36:24.575481 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:36:24 crc kubenswrapper[4763]: E1006 16:36:24.576241 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.853375 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-217d-account-create-9pkbx"] Oct 06 16:36:28 crc kubenswrapper[4763]: E1006 16:36:28.854370 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" containerName="mariadb-database-create" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.854386 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" containerName="mariadb-database-create" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.854695 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" containerName="mariadb-database-create" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.855529 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.858369 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 06 16:36:28 crc kubenswrapper[4763]: I1006 16:36:28.874861 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-217d-account-create-9pkbx"] Oct 06 16:36:29 crc kubenswrapper[4763]: I1006 16:36:29.008005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wsx\" (UniqueName: \"kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx\") pod \"aodh-217d-account-create-9pkbx\" (UID: \"9f306383-e72b-4e5a-a010-f4685d581d95\") " pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:29 crc kubenswrapper[4763]: I1006 16:36:29.110487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wsx\" (UniqueName: \"kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx\") pod \"aodh-217d-account-create-9pkbx\" (UID: \"9f306383-e72b-4e5a-a010-f4685d581d95\") " pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:29 crc kubenswrapper[4763]: I1006 16:36:29.139966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wsx\" (UniqueName: \"kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx\") pod \"aodh-217d-account-create-9pkbx\" (UID: \"9f306383-e72b-4e5a-a010-f4685d581d95\") " pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:29 crc kubenswrapper[4763]: I1006 16:36:29.180035 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:29 crc kubenswrapper[4763]: I1006 16:36:29.684847 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-217d-account-create-9pkbx"] Oct 06 16:36:29 crc kubenswrapper[4763]: W1006 16:36:29.688516 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f306383_e72b_4e5a_a010_f4685d581d95.slice/crio-43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c WatchSource:0}: Error finding container 43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c: Status 404 returned error can't find the container with id 43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c Oct 06 16:36:30 crc kubenswrapper[4763]: I1006 16:36:30.448074 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f306383-e72b-4e5a-a010-f4685d581d95" containerID="33cc4a63bf00c0a7aa0c06650fdb78047b399b4705ef94328708ffd5783bb0f3" exitCode=0 Oct 06 16:36:30 crc kubenswrapper[4763]: I1006 16:36:30.448139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-217d-account-create-9pkbx" event={"ID":"9f306383-e72b-4e5a-a010-f4685d581d95","Type":"ContainerDied","Data":"33cc4a63bf00c0a7aa0c06650fdb78047b399b4705ef94328708ffd5783bb0f3"} Oct 06 16:36:30 crc kubenswrapper[4763]: I1006 16:36:30.448349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-217d-account-create-9pkbx" event={"ID":"9f306383-e72b-4e5a-a010-f4685d581d95","Type":"ContainerStarted","Data":"43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c"} Oct 06 16:36:31 crc kubenswrapper[4763]: I1006 16:36:31.898912 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:31 crc kubenswrapper[4763]: I1006 16:36:31.989911 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wsx\" (UniqueName: \"kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx\") pod \"9f306383-e72b-4e5a-a010-f4685d581d95\" (UID: \"9f306383-e72b-4e5a-a010-f4685d581d95\") " Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.010925 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx" (OuterVolumeSpecName: "kube-api-access-n5wsx") pod "9f306383-e72b-4e5a-a010-f4685d581d95" (UID: "9f306383-e72b-4e5a-a010-f4685d581d95"). InnerVolumeSpecName "kube-api-access-n5wsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.092224 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5wsx\" (UniqueName: \"kubernetes.io/projected/9f306383-e72b-4e5a-a010-f4685d581d95-kube-api-access-n5wsx\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.468418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-217d-account-create-9pkbx" event={"ID":"9f306383-e72b-4e5a-a010-f4685d581d95","Type":"ContainerDied","Data":"43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c"} Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.468467 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ba17d8b03bacbe90fae6e9b7543762ea5dfa5f43f89ad8c94f541e29ba402c" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.468494 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-217d-account-create-9pkbx" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.564184 4763 scope.go:117] "RemoveContainer" containerID="2fb93126c951c2d488f9aff1999eca5bb26a7a76260b06b3634316ee68445098" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.590578 4763 scope.go:117] "RemoveContainer" containerID="fef0a40dce06fb915401acb0a7473f371e68c456a8f45f689fb374e5fc4e9705" Oct 06 16:36:32 crc kubenswrapper[4763]: I1006 16:36:32.642730 4763 scope.go:117] "RemoveContainer" containerID="4698e2fa70bff7b5347c691d57977185f2cb888b59d8ceedde07630a89787c83" Oct 06 16:36:33 crc kubenswrapper[4763]: I1006 16:36:33.040999 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bf6a-account-create-g9rrk"] Oct 06 16:36:33 crc kubenswrapper[4763]: I1006 16:36:33.056885 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bf6a-account-create-g9rrk"] Oct 06 16:36:33 crc kubenswrapper[4763]: I1006 16:36:33.594917 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952bb439-e564-4c01-9d6f-07b1e941434a" path="/var/lib/kubelet/pods/952bb439-e564-4c01-9d6f-07b1e941434a/volumes" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.393100 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-jd6sn"] Oct 06 16:36:34 crc kubenswrapper[4763]: E1006 16:36:34.394095 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f306383-e72b-4e5a-a010-f4685d581d95" containerName="mariadb-account-create" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.394117 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f306383-e72b-4e5a-a010-f4685d581d95" containerName="mariadb-account-create" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.394439 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f306383-e72b-4e5a-a010-f4685d581d95" containerName="mariadb-account-create" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.396003 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.401834 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.402613 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9jfqs" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.402910 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.423096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jd6sn"] Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.550932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.550981 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.551015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvs4m\" (UniqueName: \"kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.551248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.652842 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvs4m\" (UniqueName: \"kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.653192 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.653441 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.654216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.658327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.658728 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.661581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.674114 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvs4m\" (UniqueName: \"kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m\") pod \"aodh-db-sync-jd6sn\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:34 crc kubenswrapper[4763]: I1006 16:36:34.732235 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:35 crc kubenswrapper[4763]: I1006 16:36:35.295775 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jd6sn"] Oct 06 16:36:35 crc kubenswrapper[4763]: W1006 16:36:35.305485 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd58816bc_70e9_4dfa_93ea_fc53e07fa77e.slice/crio-6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae WatchSource:0}: Error finding container 6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae: Status 404 returned error can't find the container with id 6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae Oct 06 16:36:35 crc kubenswrapper[4763]: I1006 16:36:35.503572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jd6sn" event={"ID":"d58816bc-70e9-4dfa-93ea-fc53e07fa77e","Type":"ContainerStarted","Data":"6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae"} Oct 06 16:36:36 crc kubenswrapper[4763]: I1006 16:36:36.576042 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:36:36 crc kubenswrapper[4763]: E1006 16:36:36.576443 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:36:40 crc kubenswrapper[4763]: I1006 16:36:40.039464 4763 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-cnfnt"] Oct 06 16:36:40 crc kubenswrapper[4763]: I1006 16:36:40.051372 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-cnfnt"] Oct 06 16:36:40 crc kubenswrapper[4763]: I1006 16:36:40.568717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jd6sn" event={"ID":"d58816bc-70e9-4dfa-93ea-fc53e07fa77e","Type":"ContainerStarted","Data":"375aa3e0556c8ac0e6eae5c5d158c399b33f77cedbd7772f523e9f72a2ab123d"} Oct 06 16:36:40 crc kubenswrapper[4763]: I1006 16:36:40.595127 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-jd6sn" podStartSLOduration=2.495855482 podStartE2EDuration="6.595103261s" podCreationTimestamp="2025-10-06 16:36:34 +0000 UTC" firstStartedPulling="2025-10-06 16:36:35.308358192 +0000 UTC m=+6192.463650704" lastFinishedPulling="2025-10-06 16:36:39.407605971 +0000 UTC m=+6196.562898483" observedRunningTime="2025-10-06 16:36:40.582271342 +0000 UTC m=+6197.737563894" watchObservedRunningTime="2025-10-06 16:36:40.595103261 +0000 UTC m=+6197.750395813" Oct 06 16:36:40 crc kubenswrapper[4763]: I1006 16:36:40.850532 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 16:36:41 crc kubenswrapper[4763]: I1006 16:36:41.596549 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801e07ad-1c28-4afe-a196-77098d825544" path="/var/lib/kubelet/pods/801e07ad-1c28-4afe-a196-77098d825544/volumes" Oct 06 16:36:42 crc kubenswrapper[4763]: I1006 16:36:42.590112 4763 generic.go:334] "Generic (PLEG): container finished" podID="d58816bc-70e9-4dfa-93ea-fc53e07fa77e" containerID="375aa3e0556c8ac0e6eae5c5d158c399b33f77cedbd7772f523e9f72a2ab123d" exitCode=0 Oct 06 16:36:42 crc kubenswrapper[4763]: I1006 16:36:42.590225 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jd6sn" event={"ID":"d58816bc-70e9-4dfa-93ea-fc53e07fa77e","Type":"ContainerDied","Data":"375aa3e0556c8ac0e6eae5c5d158c399b33f77cedbd7772f523e9f72a2ab123d"} Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.109563 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.203287 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvs4m\" (UniqueName: \"kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m\") pod \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.203360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data\") pod \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.203465 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts\") pod \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.204335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle\") pod \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\" (UID: \"d58816bc-70e9-4dfa-93ea-fc53e07fa77e\") " Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.208298 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts" (OuterVolumeSpecName: "scripts") pod "d58816bc-70e9-4dfa-93ea-fc53e07fa77e" (UID: "d58816bc-70e9-4dfa-93ea-fc53e07fa77e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.208916 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m" (OuterVolumeSpecName: "kube-api-access-wvs4m") pod "d58816bc-70e9-4dfa-93ea-fc53e07fa77e" (UID: "d58816bc-70e9-4dfa-93ea-fc53e07fa77e"). InnerVolumeSpecName "kube-api-access-wvs4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.236663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d58816bc-70e9-4dfa-93ea-fc53e07fa77e" (UID: "d58816bc-70e9-4dfa-93ea-fc53e07fa77e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.240668 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data" (OuterVolumeSpecName: "config-data") pod "d58816bc-70e9-4dfa-93ea-fc53e07fa77e" (UID: "d58816bc-70e9-4dfa-93ea-fc53e07fa77e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.308199 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.308239 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.308251 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvs4m\" (UniqueName: \"kubernetes.io/projected/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-kube-api-access-wvs4m\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.308259 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58816bc-70e9-4dfa-93ea-fc53e07fa77e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.610006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jd6sn" event={"ID":"d58816bc-70e9-4dfa-93ea-fc53e07fa77e","Type":"ContainerDied","Data":"6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae"} Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.610053 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df385811f56ae4b8ad8256bc592bdfcce9e4748e414a8ebe66530a091789eae" Oct 06 16:36:44 crc kubenswrapper[4763]: I1006 16:36:44.610111 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jd6sn" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.981026 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 06 16:36:48 crc kubenswrapper[4763]: E1006 16:36:48.981890 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58816bc-70e9-4dfa-93ea-fc53e07fa77e" containerName="aodh-db-sync" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.981908 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58816bc-70e9-4dfa-93ea-fc53e07fa77e" containerName="aodh-db-sync" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.982145 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58816bc-70e9-4dfa-93ea-fc53e07fa77e" containerName="aodh-db-sync" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.984703 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.989131 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9jfqs" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.989131 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 06 16:36:48 crc kubenswrapper[4763]: I1006 16:36:48.989131 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.021125 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.026019 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frpr7\" (UniqueName: \"kubernetes.io/projected/019b83d2-a0e4-439d-8df3-f62c047400fb-kube-api-access-frpr7\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.026153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-scripts\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.026197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-config-data\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.026278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-combined-ca-bundle\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.132092 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-combined-ca-bundle\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.132183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frpr7\" (UniqueName: \"kubernetes.io/projected/019b83d2-a0e4-439d-8df3-f62c047400fb-kube-api-access-frpr7\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.132292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-scripts\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.132318 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-config-data\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: 
I1006 16:36:49.149632 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-config-data\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.153808 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-combined-ca-bundle\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.155061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019b83d2-a0e4-439d-8df3-f62c047400fb-scripts\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.159558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frpr7\" (UniqueName: \"kubernetes.io/projected/019b83d2-a0e4-439d-8df3-f62c047400fb-kube-api-access-frpr7\") pod \"aodh-0\" (UID: \"019b83d2-a0e4-439d-8df3-f62c047400fb\") " pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.324870 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 06 16:36:49 crc kubenswrapper[4763]: I1006 16:36:49.852840 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 06 16:36:50 crc kubenswrapper[4763]: I1006 16:36:50.576188 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:36:50 crc kubenswrapper[4763]: E1006 16:36:50.576757 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:36:50 crc kubenswrapper[4763]: I1006 16:36:50.691366 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"019b83d2-a0e4-439d-8df3-f62c047400fb","Type":"ContainerStarted","Data":"0fa8cd79ef0f39414aa3b4f0bda3c42e50e0f9f6889747e47fa88f146d0ae17d"} Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.547745 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.548182 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-central-agent" containerID="cri-o://33a6ce18e58449145b7703f3a4c6818fe484c7c0e48e266ab72a199cf0d84f7b" gracePeriod=30 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.548264 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="sg-core" containerID="cri-o://8318693360e4b98f757d4d551647d5cd57bfce3979ca33c03981d8b8699b4cc9" gracePeriod=30 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.548294 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="proxy-httpd" containerID="cri-o://4dbe1db5ad85f72482a36fba1c1dad73d343fbf7dd6dce373209b1a60ef3113c" gracePeriod=30 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.548292 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-notification-agent" containerID="cri-o://67520a86211a566aeb011619d398a2fd16a506c7bf35fcd945ca7d8cc2257592" gracePeriod=30 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.701008 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"019b83d2-a0e4-439d-8df3-f62c047400fb","Type":"ContainerStarted","Data":"59f30519f96df79f540ee6f18c238cfb1d7b59b4285fb11601efb99203a5fe0c"} Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.703777 4763 generic.go:334] "Generic (PLEG): container finished" podID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerID="4dbe1db5ad85f72482a36fba1c1dad73d343fbf7dd6dce373209b1a60ef3113c" exitCode=0 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.703800 4763 generic.go:334] "Generic (PLEG): container finished" podID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerID="8318693360e4b98f757d4d551647d5cd57bfce3979ca33c03981d8b8699b4cc9" exitCode=2 Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.703812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerDied","Data":"4dbe1db5ad85f72482a36fba1c1dad73d343fbf7dd6dce373209b1a60ef3113c"} Oct 06 16:36:51 crc kubenswrapper[4763]: I1006 16:36:51.703827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerDied","Data":"8318693360e4b98f757d4d551647d5cd57bfce3979ca33c03981d8b8699b4cc9"} Oct 06 16:36:52 crc kubenswrapper[4763]: I1006 16:36:52.717371 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"019b83d2-a0e4-439d-8df3-f62c047400fb","Type":"ContainerStarted","Data":"defaf801a26b4f07c606f80e26cc204ad675d05d3596c262ed7e61154c828bc1"} Oct 06 16:36:52 crc kubenswrapper[4763]: I1006 16:36:52.723257 4763 generic.go:334] "Generic (PLEG): container finished" podID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerID="33a6ce18e58449145b7703f3a4c6818fe484c7c0e48e266ab72a199cf0d84f7b" exitCode=0 Oct 06 16:36:52 crc kubenswrapper[4763]: I1006 16:36:52.723305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerDied","Data":"33a6ce18e58449145b7703f3a4c6818fe484c7c0e48e266ab72a199cf0d84f7b"} Oct 06 16:36:54 crc kubenswrapper[4763]: I1006 16:36:54.762712 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"019b83d2-a0e4-439d-8df3-f62c047400fb","Type":"ContainerStarted","Data":"f0bdde49f84f63164720ed536cf7f3737b7f36d75d30887e506fe2ca107a57a0"} Oct 06 16:36:55 crc kubenswrapper[4763]: I1006 16:36:55.774498 4763 generic.go:334] "Generic (PLEG): container finished" podID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerID="67520a86211a566aeb011619d398a2fd16a506c7bf35fcd945ca7d8cc2257592" exitCode=0 Oct 06 16:36:55 crc kubenswrapper[4763]: I1006 16:36:55.774982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerDied","Data":"67520a86211a566aeb011619d398a2fd16a506c7bf35fcd945ca7d8cc2257592"} Oct 06 16:36:55 crc kubenswrapper[4763]: I1006 16:36:55.793931 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"019b83d2-a0e4-439d-8df3-f62c047400fb","Type":"ContainerStarted","Data":"bffc19dbb188551e22883e177a2b4d2b31707560d22ebc5a3634a8c72aa83fc3"} Oct 06 16:36:55 crc kubenswrapper[4763]: I1006 16:36:55.827031 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.372446628 podStartE2EDuration="7.827006435s" podCreationTimestamp="2025-10-06 16:36:48 +0000 UTC" firstStartedPulling="2025-10-06 16:36:49.853939243 +0000 UTC m=+6207.009231755" lastFinishedPulling="2025-10-06 16:36:55.30849905 +0000 UTC m=+6212.463791562" observedRunningTime="2025-10-06 16:36:55.81354995 +0000 UTC m=+6212.968842462" watchObservedRunningTime="2025-10-06 16:36:55.827006435 +0000 UTC m=+6212.982298957" Oct 06 16:36:55 crc kubenswrapper[4763]: I1006 16:36:55.909981 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.094807 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.094873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.094954 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.095016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.095037 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.095117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgfh\" (UniqueName: \"kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.095144 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml\") pod \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\" (UID: \"12ee4c6f-66b1-43e8-96df-1a5b44b5261a\") " Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.096786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.098279 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.100555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts" (OuterVolumeSpecName: "scripts") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.102815 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh" (OuterVolumeSpecName: "kube-api-access-7wgfh") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "kube-api-access-7wgfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.141629 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.175645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200541 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200568 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200578 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgfh\" (UniqueName: \"kubernetes.io/projected/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-kube-api-access-7wgfh\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200590 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200600 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.200645 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.207372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data" (OuterVolumeSpecName: "config-data") pod "12ee4c6f-66b1-43e8-96df-1a5b44b5261a" (UID: "12ee4c6f-66b1-43e8-96df-1a5b44b5261a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.302818 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee4c6f-66b1-43e8-96df-1a5b44b5261a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.809141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ee4c6f-66b1-43e8-96df-1a5b44b5261a","Type":"ContainerDied","Data":"5cecddae9b5d4ee959a3cefc0a1949ebb1fb5479d99a859329cf643b867ec2ab"} Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.809216 4763 scope.go:117] "RemoveContainer" containerID="4dbe1db5ad85f72482a36fba1c1dad73d343fbf7dd6dce373209b1a60ef3113c" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.809161 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.844706 4763 scope.go:117] "RemoveContainer" containerID="8318693360e4b98f757d4d551647d5cd57bfce3979ca33c03981d8b8699b4cc9" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.880367 4763 scope.go:117] "RemoveContainer" containerID="67520a86211a566aeb011619d398a2fd16a506c7bf35fcd945ca7d8cc2257592" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.890205 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.903105 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.911237 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.911804 4763 scope.go:117] "RemoveContainer" containerID="33a6ce18e58449145b7703f3a4c6818fe484c7c0e48e266ab72a199cf0d84f7b" Oct 06 16:36:56 crc kubenswrapper[4763]: E1006 16:36:56.912071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="sg-core" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.912090 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="sg-core" Oct 06 16:36:56 crc kubenswrapper[4763]: E1006 16:36:56.912127 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-notification-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.912134 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-notification-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: E1006 16:36:56.912240 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="proxy-httpd" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.912254 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="proxy-httpd" Oct 06 16:36:56 crc kubenswrapper[4763]: E1006 16:36:56.912306 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-central-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.912313 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-central-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.913429 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-central-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.913452 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="sg-core" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.913465 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="proxy-httpd" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.913483 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" containerName="ceilometer-notification-agent" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.915483 4763 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.917520 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.918742 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 16:36:56 crc kubenswrapper[4763]: I1006 16:36:56.919422 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wnh\" (UniqueName: \"kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.016983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.017109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.119545 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc 
kubenswrapper[4763]: I1006 16:36:57.119678 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.119782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.119844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wnh\" (UniqueName: \"kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.120320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.120411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.120450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.120552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.120726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.125568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.126564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.128000 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.138323 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wnh\" (UniqueName: \"kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.141691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.236787 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.588687 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ee4c6f-66b1-43e8-96df-1a5b44b5261a" path="/var/lib/kubelet/pods/12ee4c6f-66b1-43e8-96df-1a5b44b5261a/volumes" Oct 06 16:36:57 crc kubenswrapper[4763]: I1006 16:36:57.862058 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:36:58 crc kubenswrapper[4763]: I1006 16:36:58.829789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerStarted","Data":"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a"} Oct 06 16:36:58 crc kubenswrapper[4763]: I1006 16:36:58.830084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerStarted","Data":"097f82806b380d4830add2b970742ebfc6b72f5dc29c3ee67640183ba8fc48b5"} Oct 06 16:36:59 crc kubenswrapper[4763]: I1006 16:36:59.843344 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerStarted","Data":"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2"} Oct 06 16:37:00 crc kubenswrapper[4763]: I1006 16:37:00.856601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerStarted","Data":"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5"} Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.610349 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-xf4dm"] Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.612038 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.621816 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xf4dm"] Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.636147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpnl\" (UniqueName: \"kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl\") pod \"manila-db-create-xf4dm\" (UID: \"d75a0351-94d8-4e4f-9ea0-74ccc0b74937\") " pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.738125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpnl\" (UniqueName: \"kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl\") pod \"manila-db-create-xf4dm\" (UID: \"d75a0351-94d8-4e4f-9ea0-74ccc0b74937\") " pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.756728 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpnl\" (UniqueName: \"kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl\") pod \"manila-db-create-xf4dm\" (UID: \"d75a0351-94d8-4e4f-9ea0-74ccc0b74937\") " pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:01 crc kubenswrapper[4763]: I1006 16:37:01.930502 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.451034 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xf4dm"] Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.575158 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:37:02 crc kubenswrapper[4763]: E1006 16:37:02.575429 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.874175 4763 generic.go:334] "Generic (PLEG): container finished" podID="d75a0351-94d8-4e4f-9ea0-74ccc0b74937" containerID="b51009a86e9f1c203b046d0b42717d6b63ba05e88ddd87225deb993660c3b70b" exitCode=0 Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.874424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xf4dm" event={"ID":"d75a0351-94d8-4e4f-9ea0-74ccc0b74937","Type":"ContainerDied","Data":"b51009a86e9f1c203b046d0b42717d6b63ba05e88ddd87225deb993660c3b70b"} Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.874544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xf4dm" event={"ID":"d75a0351-94d8-4e4f-9ea0-74ccc0b74937","Type":"ContainerStarted","Data":"35fec314e4bbcb5e5ef1d7b9bb9e1ab1f25c17c4eba37425b07d3399796e1b45"} Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.877513 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerStarted","Data":"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82"} Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.878601 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 16:37:02 crc kubenswrapper[4763]: I1006 16:37:02.940277 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.837795165 podStartE2EDuration="6.940258432s" podCreationTimestamp="2025-10-06 16:36:56 +0000 UTC" firstStartedPulling="2025-10-06 16:36:57.868452093 +0000 UTC m=+6215.023744635" lastFinishedPulling="2025-10-06 16:37:01.9709154 +0000 UTC m=+6219.126207902" observedRunningTime="2025-10-06 16:37:02.937012504 +0000 UTC m=+6220.092305026" watchObservedRunningTime="2025-10-06 16:37:02.940258432 +0000 UTC m=+6220.095550944" Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.366830 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.408247 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mpnl\" (UniqueName: \"kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl\") pod \"d75a0351-94d8-4e4f-9ea0-74ccc0b74937\" (UID: \"d75a0351-94d8-4e4f-9ea0-74ccc0b74937\") " Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.416008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl" (OuterVolumeSpecName: "kube-api-access-5mpnl") pod "d75a0351-94d8-4e4f-9ea0-74ccc0b74937" (UID: "d75a0351-94d8-4e4f-9ea0-74ccc0b74937"). InnerVolumeSpecName "kube-api-access-5mpnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.517021 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mpnl\" (UniqueName: \"kubernetes.io/projected/d75a0351-94d8-4e4f-9ea0-74ccc0b74937-kube-api-access-5mpnl\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.903877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xf4dm" event={"ID":"d75a0351-94d8-4e4f-9ea0-74ccc0b74937","Type":"ContainerDied","Data":"35fec314e4bbcb5e5ef1d7b9bb9e1ab1f25c17c4eba37425b07d3399796e1b45"} Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.904214 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fec314e4bbcb5e5ef1d7b9bb9e1ab1f25c17c4eba37425b07d3399796e1b45" Oct 06 16:37:04 crc kubenswrapper[4763]: I1006 16:37:04.904269 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-xf4dm" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.740296 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-6aa4-account-create-77nz4"] Oct 06 16:37:11 crc kubenswrapper[4763]: E1006 16:37:11.741998 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75a0351-94d8-4e4f-9ea0-74ccc0b74937" containerName="mariadb-database-create" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.742035 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75a0351-94d8-4e4f-9ea0-74ccc0b74937" containerName="mariadb-database-create" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.742683 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75a0351-94d8-4e4f-9ea0-74ccc0b74937" containerName="mariadb-database-create" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.744417 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.746330 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.755655 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6aa4-account-create-77nz4"] Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.811289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttktm\" (UniqueName: \"kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm\") pod \"manila-6aa4-account-create-77nz4\" (UID: \"49af2895-6886-419f-84c0-e92cd4196c4f\") " pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.913825 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttktm\" (UniqueName: \"kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm\") pod \"manila-6aa4-account-create-77nz4\" (UID: \"49af2895-6886-419f-84c0-e92cd4196c4f\") " pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:11 crc kubenswrapper[4763]: I1006 16:37:11.938537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttktm\" (UniqueName: \"kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm\") pod \"manila-6aa4-account-create-77nz4\" (UID: \"49af2895-6886-419f-84c0-e92cd4196c4f\") " pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:12 crc kubenswrapper[4763]: I1006 16:37:12.081670 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:12 crc kubenswrapper[4763]: I1006 16:37:12.624865 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6aa4-account-create-77nz4"] Oct 06 16:37:12 crc kubenswrapper[4763]: I1006 16:37:12.991003 4763 generic.go:334] "Generic (PLEG): container finished" podID="49af2895-6886-419f-84c0-e92cd4196c4f" containerID="c4accdecbc119e688ba398fb2853dcda6feca22aa6328c6351255af0ccd9211a" exitCode=0 Oct 06 16:37:12 crc kubenswrapper[4763]: I1006 16:37:12.991143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6aa4-account-create-77nz4" event={"ID":"49af2895-6886-419f-84c0-e92cd4196c4f","Type":"ContainerDied","Data":"c4accdecbc119e688ba398fb2853dcda6feca22aa6328c6351255af0ccd9211a"} Oct 06 16:37:12 crc kubenswrapper[4763]: I1006 16:37:12.991300 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6aa4-account-create-77nz4" event={"ID":"49af2895-6886-419f-84c0-e92cd4196c4f","Type":"ContainerStarted","Data":"382c77f4327b731836252bce08a40b283f63c253c5f81c32b7cef1a02c9061cf"} Oct 06 16:37:14 crc kubenswrapper[4763]: I1006 16:37:14.413239 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:14 crc kubenswrapper[4763]: I1006 16:37:14.474235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttktm\" (UniqueName: \"kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm\") pod \"49af2895-6886-419f-84c0-e92cd4196c4f\" (UID: \"49af2895-6886-419f-84c0-e92cd4196c4f\") " Oct 06 16:37:14 crc kubenswrapper[4763]: I1006 16:37:14.481586 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm" (OuterVolumeSpecName: "kube-api-access-ttktm") pod "49af2895-6886-419f-84c0-e92cd4196c4f" (UID: "49af2895-6886-419f-84c0-e92cd4196c4f"). InnerVolumeSpecName "kube-api-access-ttktm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:37:14 crc kubenswrapper[4763]: I1006 16:37:14.575744 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttktm\" (UniqueName: \"kubernetes.io/projected/49af2895-6886-419f-84c0-e92cd4196c4f-kube-api-access-ttktm\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:15 crc kubenswrapper[4763]: I1006 16:37:15.020431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6aa4-account-create-77nz4" event={"ID":"49af2895-6886-419f-84c0-e92cd4196c4f","Type":"ContainerDied","Data":"382c77f4327b731836252bce08a40b283f63c253c5f81c32b7cef1a02c9061cf"} Oct 06 16:37:15 crc kubenswrapper[4763]: I1006 16:37:15.020473 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="382c77f4327b731836252bce08a40b283f63c253c5f81c32b7cef1a02c9061cf" Oct 06 16:37:15 crc kubenswrapper[4763]: I1006 16:37:15.020548 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6aa4-account-create-77nz4" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.104107 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-x976s"] Oct 06 16:37:17 crc kubenswrapper[4763]: E1006 16:37:17.105426 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49af2895-6886-419f-84c0-e92cd4196c4f" containerName="mariadb-account-create" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.105459 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="49af2895-6886-419f-84c0-e92cd4196c4f" containerName="mariadb-account-create" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.105782 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="49af2895-6886-419f-84c0-e92cd4196c4f" containerName="mariadb-account-create" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.107335 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.112156 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4ctd7" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.112582 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.128127 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-x976s"] Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.133195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.133293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.133377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.133419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpcr\" (UniqueName: \"kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.235703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.235792 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.235887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.235934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpcr\" (UniqueName: \"kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.243359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.243490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.249094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.261436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpcr\" (UniqueName: \"kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr\") pod \"manila-db-sync-x976s\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.496805 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-x976s" Oct 06 16:37:17 crc kubenswrapper[4763]: I1006 16:37:17.577492 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:37:17 crc kubenswrapper[4763]: E1006 16:37:17.578344 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:37:18 crc kubenswrapper[4763]: I1006 16:37:18.586115 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-x976s"] Oct 06 16:37:19 crc kubenswrapper[4763]: I1006 16:37:19.063144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x976s" event={"ID":"6eccda7d-b9ac-464b-bf97-3281dca23b6c","Type":"ContainerStarted","Data":"85f3aa307aed019f8d6b1c183875192d970e6c6a6251c6a0c070b37709ef3b49"} Oct 06 16:37:24 crc kubenswrapper[4763]: I1006 16:37:24.133350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x976s" event={"ID":"6eccda7d-b9ac-464b-bf97-3281dca23b6c","Type":"ContainerStarted","Data":"b9e96b884f2f1dcfc8a175f97aa0f8cc90a16ff704b2a8541031e75ef26b3f82"} Oct 06 16:37:24 crc kubenswrapper[4763]: I1006 16:37:24.166952 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-x976s" podStartSLOduration=2.871962315 podStartE2EDuration="7.16692878s" podCreationTimestamp="2025-10-06 16:37:17 +0000 UTC" firstStartedPulling="2025-10-06 16:37:18.592032804 +0000 UTC m=+6235.747325316" lastFinishedPulling="2025-10-06 16:37:22.886999259 +0000 UTC m=+6240.042291781" observedRunningTime="2025-10-06 16:37:24.164301909 +0000 UTC m=+6241.319594411" watchObservedRunningTime="2025-10-06 16:37:24.16692878 +0000 UTC m=+6241.322221292" Oct 06 16:37:26 crc kubenswrapper[4763]: I1006 16:37:26.157178 4763 generic.go:334] "Generic (PLEG): container finished" podID="6eccda7d-b9ac-464b-bf97-3281dca23b6c" containerID="b9e96b884f2f1dcfc8a175f97aa0f8cc90a16ff704b2a8541031e75ef26b3f82" exitCode=0 Oct 06 16:37:26 crc kubenswrapper[4763]: I1006 16:37:26.157222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x976s" event={"ID":"6eccda7d-b9ac-464b-bf97-3281dca23b6c","Type":"ContainerDied","Data":"b9e96b884f2f1dcfc8a175f97aa0f8cc90a16ff704b2a8541031e75ef26b3f82"} Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.248808 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.712014 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-x976s" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.827696 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle\") pod \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.827763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data\") pod \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.827828 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data\") pod \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.827865 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgpcr\" (UniqueName: \"kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr\") pod \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\" (UID: \"6eccda7d-b9ac-464b-bf97-3281dca23b6c\") " Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.836031 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "6eccda7d-b9ac-464b-bf97-3281dca23b6c" (UID: "6eccda7d-b9ac-464b-bf97-3281dca23b6c"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.836235 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr" (OuterVolumeSpecName: "kube-api-access-zgpcr") pod "6eccda7d-b9ac-464b-bf97-3281dca23b6c" (UID: "6eccda7d-b9ac-464b-bf97-3281dca23b6c"). InnerVolumeSpecName "kube-api-access-zgpcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.841071 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data" (OuterVolumeSpecName: "config-data") pod "6eccda7d-b9ac-464b-bf97-3281dca23b6c" (UID: "6eccda7d-b9ac-464b-bf97-3281dca23b6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.857930 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eccda7d-b9ac-464b-bf97-3281dca23b6c" (UID: "6eccda7d-b9ac-464b-bf97-3281dca23b6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.930588 4763 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.931035 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.931116 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgpcr\" (UniqueName: \"kubernetes.io/projected/6eccda7d-b9ac-464b-bf97-3281dca23b6c-kube-api-access-zgpcr\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:27 crc kubenswrapper[4763]: I1006 16:37:27.931169 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eccda7d-b9ac-464b-bf97-3281dca23b6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.180692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x976s" event={"ID":"6eccda7d-b9ac-464b-bf97-3281dca23b6c","Type":"ContainerDied","Data":"85f3aa307aed019f8d6b1c183875192d970e6c6a6251c6a0c070b37709ef3b49"} Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.180738 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f3aa307aed019f8d6b1c183875192d970e6c6a6251c6a0c070b37709ef3b49" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.180744 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-x976s" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.528311 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: E1006 16:37:28.528791 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eccda7d-b9ac-464b-bf97-3281dca23b6c" containerName="manila-db-sync" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.528803 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eccda7d-b9ac-464b-bf97-3281dca23b6c" containerName="manila-db-sync" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.529033 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eccda7d-b9ac-464b-bf97-3281dca23b6c" containerName="manila-db-sync" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.530112 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.536354 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.536602 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.536788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.536883 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4ctd7" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.550383 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.553032 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.556714 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.559428 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.571996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.624439 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.626826 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.632335 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.646924 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-scripts\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lmg\" (UniqueName: \"kubernetes.io/projected/9af5d28a-7810-4855-8899-509a22c241da-kube-api-access-c7lmg\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zn87\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-kube-api-access-8zn87\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-scripts\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647691 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-ceph\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647763 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9af5d28a-7810-4855-8899-509a22c241da-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.647983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.648082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.648224 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.648998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.654910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.759844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-scripts\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.759900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lmg\" (UniqueName: \"kubernetes.io/projected/9af5d28a-7810-4855-8899-509a22c241da-kube-api-access-c7lmg\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.759934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.759987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zn87\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-kube-api-access-8zn87\") pod 
\"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-scripts\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-ceph\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9af5d28a-7810-4855-8899-509a22c241da-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760424 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5px\" (UniqueName: \"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: 
I1006 16:37:28.760474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.760757 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.766495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.767740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9af5d28a-7810-4855-8899-509a22c241da-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.767836 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.768159 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/6fa8093a-25d4-4468-ad97-c79cdc10bc71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.775310 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.779165 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.787370 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.793387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-scripts\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.793805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-config-data\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.794075 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.793988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lmg\" (UniqueName: \"kubernetes.io/projected/9af5d28a-7810-4855-8899-509a22c241da-kube-api-access-c7lmg\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.795270 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.796101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-ceph\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.798277 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.801009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af5d28a-7810-4855-8899-509a22c241da-scripts\") pod \"manila-scheduler-0\" (UID: \"9af5d28a-7810-4855-8899-509a22c241da\") " pod="openstack/manila-scheduler-0" Oct 06 
16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.803536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa8093a-25d4-4468-ad97-c79cdc10bc71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.813318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zn87\" (UniqueName: \"kubernetes.io/projected/6fa8093a-25d4-4468-ad97-c79cdc10bc71-kube-api-access-8zn87\") pod \"manila-share-share1-0\" (UID: \"6fa8093a-25d4-4468-ad97-c79cdc10bc71\") " pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.854091 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.860451 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.862540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.863797 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.863890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-scripts\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.863977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgdvx\" (UniqueName: \"kubernetes.io/projected/41ea1863-d069-41c1-ba7a-93d82581a18b-kube-api-access-wgdvx\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea1863-d069-41c1-ba7a-93d82581a18b-logs\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5px\" (UniqueName: 
\"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864295 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data-custom\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41ea1863-d069-41c1-ba7a-93d82581a18b-etc-machine-id\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864657 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.864743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.865975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.867511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.867670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.868221 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: 
\"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.883514 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.926715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5px\" (UniqueName: \"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px\") pod \"dnsmasq-dns-7876bb76fc-7m5cq\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.951438 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.967517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-scripts\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgdvx\" (UniqueName: \"kubernetes.io/projected/41ea1863-d069-41c1-ba7a-93d82581a18b-kube-api-access-wgdvx\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989052 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea1863-d069-41c1-ba7a-93d82581a18b-logs\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data-custom\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41ea1863-d069-41c1-ba7a-93d82581a18b-etc-machine-id\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.989302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:28 crc kubenswrapper[4763]: I1006 16:37:28.990049 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea1863-d069-41c1-ba7a-93d82581a18b-logs\") pod 
\"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.015033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41ea1863-d069-41c1-ba7a-93d82581a18b-etc-machine-id\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.027345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.028079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-scripts\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.039508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data-custom\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.040318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ea1863-d069-41c1-ba7a-93d82581a18b-config-data\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.070414 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgdvx\" (UniqueName: \"kubernetes.io/projected/41ea1863-d069-41c1-ba7a-93d82581a18b-kube-api-access-wgdvx\") pod \"manila-api-0\" (UID: \"41ea1863-d069-41c1-ba7a-93d82581a18b\") " pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.092526 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.580871 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:37:29 crc kubenswrapper[4763]: E1006 16:37:29.581718 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.901779 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 06 16:37:29 crc kubenswrapper[4763]: I1006 16:37:29.983849 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 06 16:37:29 crc kubenswrapper[4763]: W1006 16:37:29.988843 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fa8093a_25d4_4468_ad97_c79cdc10bc71.slice/crio-1e9a8ea98f9eef13d3a09457909d94ade203e6cf2ce5b954478fd04257a6c578 WatchSource:0}: Error finding container 1e9a8ea98f9eef13d3a09457909d94ade203e6cf2ce5b954478fd04257a6c578: Status 404 returned error can't find the container with id 1e9a8ea98f9eef13d3a09457909d94ade203e6cf2ce5b954478fd04257a6c578 Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.143580 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.227348 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"] Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.257064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"41ea1863-d069-41c1-ba7a-93d82581a18b","Type":"ContainerStarted","Data":"8c766d7557ebc3a6401f8acc46cfeb42fb81a975aee96e099698ebf4967e1914"} Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.258750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" event={"ID":"bd64ad5c-f547-4c6d-8046-701a2f205431","Type":"ContainerStarted","Data":"8a300fd10bdbd58db2defc8858a187c5877fd5262782f06aa5239a13a115f7e5"} Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.263678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6fa8093a-25d4-4468-ad97-c79cdc10bc71","Type":"ContainerStarted","Data":"1e9a8ea98f9eef13d3a09457909d94ade203e6cf2ce5b954478fd04257a6c578"} Oct 06 16:37:30 crc kubenswrapper[4763]: I1006 16:37:30.264561 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9af5d28a-7810-4855-8899-509a22c241da","Type":"ContainerStarted","Data":"824c3349191561b324771a360c02a3d69af8099781486aa5b11c7b4358f92557"} Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.292252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9af5d28a-7810-4855-8899-509a22c241da","Type":"ContainerStarted","Data":"a1fc130703d7845c75bfe68dd82bb1ce1b5f38d42874ddd08028f13ee3bd2e14"} Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.311413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-api-0" event={"ID":"41ea1863-d069-41c1-ba7a-93d82581a18b","Type":"ContainerStarted","Data":"67475e9d92f26b46f5d3faad4198720b8264f49d947f0d8074805071de5f7e1b"} Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.311466 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"41ea1863-d069-41c1-ba7a-93d82581a18b","Type":"ContainerStarted","Data":"925baebe18d37492e9fd93676a8e8128491578956f87c80bd18bcb08678e5500"} Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.311894 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.317126 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerID="dece35b188bdbab644fa074f86b5f1216b3b3bb5b8f8c526654659f1a0dd7f17" exitCode=0 Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.317172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" event={"ID":"bd64ad5c-f547-4c6d-8046-701a2f205431","Type":"ContainerDied","Data":"dece35b188bdbab644fa074f86b5f1216b3b3bb5b8f8c526654659f1a0dd7f17"} Oct 06 16:37:31 crc kubenswrapper[4763]: I1006 16:37:31.333008 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.332991641 podStartE2EDuration="3.332991641s" podCreationTimestamp="2025-10-06 16:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:37:31.328801268 +0000 UTC m=+6248.484093810" watchObservedRunningTime="2025-10-06 16:37:31.332991641 +0000 UTC m=+6248.488284153" Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.362484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" event={"ID":"bd64ad5c-f547-4c6d-8046-701a2f205431","Type":"ContainerStarted","Data":"da84464a3e539736bd6da5e60f0fb1024043262c814eda4ad1c0642ea20b63b3"} Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.363121 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.387657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9af5d28a-7810-4855-8899-509a22c241da","Type":"ContainerStarted","Data":"d2635da200dd116d880b8fe85c42ab0932ff68403fad594d1f6d90ea66bb2626"} Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.397662 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" podStartSLOduration=4.397646763 podStartE2EDuration="4.397646763s" podCreationTimestamp="2025-10-06 16:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:37:32.397080027 +0000 UTC m=+6249.552372529" watchObservedRunningTime="2025-10-06 16:37:32.397646763 +0000 UTC m=+6249.552939275" Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.432473 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.7014174090000003 podStartE2EDuration="4.432455908s" podCreationTimestamp="2025-10-06 16:37:28 +0000 UTC" firstStartedPulling="2025-10-06 16:37:29.926656707 +0000 UTC m=+6247.081949209" lastFinishedPulling="2025-10-06 
16:37:30.657695196 +0000 UTC m=+6247.812987708" observedRunningTime="2025-10-06 16:37:32.430118835 +0000 UTC m=+6249.585411347" watchObservedRunningTime="2025-10-06 16:37:32.432455908 +0000 UTC m=+6249.587748420" Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.813023 4763 scope.go:117] "RemoveContainer" containerID="25c9fa62dc673320e1af2c553326145d88577980050a9e6653b43522d803a339" Oct 06 16:37:32 crc kubenswrapper[4763]: I1006 16:37:32.855775 4763 scope.go:117] "RemoveContainer" containerID="0ea19e1c9094053e750b7a5aadd32c1483cfb26c4e021366e6930b9aaae6fd7a" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.034693 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"] Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.047338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"] Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.047463 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.198111 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cp7f\" (UniqueName: \"kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.198524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.198827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.301084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cp7f\" (UniqueName: \"kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.301152 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.301278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 
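
Each "Observed pod startup duration" entry above packs the whole startup timeline into key=value fields, and the pull window explains the gap between the two durations: podStartSLOduration excludes image-pull time, podStartE2EDuration includes it, and the all-zero 0001-01-01 timestamps (as on the manila-api-0 entry) mean no pull was needed. A small parser, shown against the manila-scheduler-0 entry quoted from the log:

    import re
    from datetime import datetime

    # One pod_startup_latency_tracker entry for manila-scheduler-0, as logged
    # above (space-separated key=value fields, some quoted).
    ENTRY = (
        'pod="openstack/manila-scheduler-0" podStartSLOduration=3.7014174090000003 '
        'podStartE2EDuration="4.432455908s" '
        'firstStartedPulling="2025-10-06 16:37:29.926656707 +0000 UTC m=+6247.081949209" '
        'lastFinishedPulling="2025-10-06 16:37:30.657695196 +0000 UTC m=+6247.812987708"'
    )

    def ts(raw: str) -> datetime:
        # Drop the monotonic "m=+..." suffix; trim nanoseconds to microseconds.
        date, clock, tz = raw.split()[:3]
        return datetime.strptime(f"{date} {clock[:15]} {tz}", "%Y-%m-%d %H:%M:%S.%f %z")

    fields = {k: v.strip('"') for k, v in re.findall(r'(\w+)=("[^"]*"|\S+)', ENTRY)}
    pull = ts(fields["lastFinishedPulling"]) - ts(fields["firstStartedPulling"])
    print(f"pull window {pull.total_seconds():.3f}s, "          # 0.731s pulling
          f"SLO {float(fields['podStartSLOduration']):.3f}s, "  # 3.701s (excludes pull)
          f"E2E {fields['podStartE2EDuration']}")               # 4.432455908s (includes it)
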
16:37:33.302428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.308628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.320208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cp7f\" (UniqueName: \"kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f\") pod \"redhat-operators-f2dz8\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") " pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.381273 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:33 crc kubenswrapper[4763]: I1006 16:37:33.866339 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"] Oct 06 16:37:33 crc kubenswrapper[4763]: W1006 16:37:33.884558 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f8eb590_3a6f_4c0c_a576_acac2d84c133.slice/crio-a62e050f5e582f46562cc1a1e26e487bcdf3e48884528b579edbb6bc1a898add WatchSource:0}: Error finding container a62e050f5e582f46562cc1a1e26e487bcdf3e48884528b579edbb6bc1a898add: Status 404 returned error can't find the container with id a62e050f5e582f46562cc1a1e26e487bcdf3e48884528b579edbb6bc1a898add Oct 06 16:37:34 crc kubenswrapper[4763]: I1006 16:37:34.426791 4763 generic.go:334] "Generic (PLEG): container finished" podID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerID="b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec" exitCode=0 Oct 06 16:37:34 crc kubenswrapper[4763]: I1006 16:37:34.426909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerDied","Data":"b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec"} Oct 06 16:37:34 crc kubenswrapper[4763]: I1006 16:37:34.427124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerStarted","Data":"a62e050f5e582f46562cc1a1e26e487bcdf3e48884528b579edbb6bc1a898add"} Oct 06 16:37:37 crc kubenswrapper[4763]: I1006 16:37:37.460606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6fa8093a-25d4-4468-ad97-c79cdc10bc71","Type":"ContainerStarted","Data":"504b80c254680e5a5436ced0dcb7b5b972e1a9ac3ab888378d467c2b0753bf4d"} Oct 06 16:37:38 crc kubenswrapper[4763]: I1006 16:37:38.476501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6fa8093a-25d4-4468-ad97-c79cdc10bc71","Type":"ContainerStarted","Data":"2f3aeb76a076b19c3e87766f708747bbe91a529a5d0d67d5e8a3b3d5f255f2bb"} Oct 06 16:37:38 crc kubenswrapper[4763]: 
I1006 16:37:38.479663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerStarted","Data":"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"} Oct 06 16:37:38 crc kubenswrapper[4763]: I1006 16:37:38.508057 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.804662085 podStartE2EDuration="10.508040287s" podCreationTimestamp="2025-10-06 16:37:28 +0000 UTC" firstStartedPulling="2025-10-06 16:37:29.992493345 +0000 UTC m=+6247.147785857" lastFinishedPulling="2025-10-06 16:37:36.695871547 +0000 UTC m=+6253.851164059" observedRunningTime="2025-10-06 16:37:38.504315196 +0000 UTC m=+6255.659607708" watchObservedRunningTime="2025-10-06 16:37:38.508040287 +0000 UTC m=+6255.663332809" Oct 06 16:37:38 crc kubenswrapper[4763]: I1006 16:37:38.855842 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 06 16:37:38 crc kubenswrapper[4763]: I1006 16:37:38.884188 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 06 16:37:38 crc kubenswrapper[4763]: I1006 16:37:38.954132 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.032298 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.033047 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="dnsmasq-dns" containerID="cri-o://ae1c758d43eec4438ddc622373d0efdbdad91508e76b510e36922681a2272269" gracePeriod=10 Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.495422 4763 generic.go:334] "Generic (PLEG): container finished" podID="1317520f-29bf-4615-b385-b03a9ffa898e" containerID="ae1c758d43eec4438ddc622373d0efdbdad91508e76b510e36922681a2272269" exitCode=0 Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.495497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" event={"ID":"1317520f-29bf-4615-b385-b03a9ffa898e","Type":"ContainerDied","Data":"ae1c758d43eec4438ddc622373d0efdbdad91508e76b510e36922681a2272269"} Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.496773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" event={"ID":"1317520f-29bf-4615-b385-b03a9ffa898e","Type":"ContainerDied","Data":"2d867ef64682c7d96f31fc3519e27ad901f04b9738ad8870b33cc69c7254cdee"} Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.496788 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d867ef64682c7d96f31fc3519e27ad901f04b9738ad8870b33cc69c7254cdee" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.573231 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.637403 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc\") pod \"1317520f-29bf-4615-b385-b03a9ffa898e\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.637478 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config\") pod \"1317520f-29bf-4615-b385-b03a9ffa898e\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.637520 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb\") pod \"1317520f-29bf-4615-b385-b03a9ffa898e\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.637565 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rttds\" (UniqueName: \"kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds\") pod \"1317520f-29bf-4615-b385-b03a9ffa898e\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.637582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb\") pod \"1317520f-29bf-4615-b385-b03a9ffa898e\" (UID: \"1317520f-29bf-4615-b385-b03a9ffa898e\") " Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.643637 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds" (OuterVolumeSpecName: "kube-api-access-rttds") pod "1317520f-29bf-4615-b385-b03a9ffa898e" (UID: "1317520f-29bf-4615-b385-b03a9ffa898e"). InnerVolumeSpecName "kube-api-access-rttds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.693292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1317520f-29bf-4615-b385-b03a9ffa898e" (UID: "1317520f-29bf-4615-b385-b03a9ffa898e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.698472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1317520f-29bf-4615-b385-b03a9ffa898e" (UID: "1317520f-29bf-4615-b385-b03a9ffa898e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.702064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config" (OuterVolumeSpecName: "config") pod "1317520f-29bf-4615-b385-b03a9ffa898e" (UID: "1317520f-29bf-4615-b385-b03a9ffa898e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.715550 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1317520f-29bf-4615-b385-b03a9ffa898e" (UID: "1317520f-29bf-4615-b385-b03a9ffa898e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.739992 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.740022 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.740036 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rttds\" (UniqueName: \"kubernetes.io/projected/1317520f-29bf-4615-b385-b03a9ffa898e-kube-api-access-rttds\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.740045 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:39 crc kubenswrapper[4763]: I1006 16:37:39.740053 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1317520f-29bf-4615-b385-b03a9ffa898e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:40 crc kubenswrapper[4763]: I1006 16:37:40.506407 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-n2jvs" Oct 06 16:37:40 crc kubenswrapper[4763]: I1006 16:37:40.545379 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:37:40 crc kubenswrapper[4763]: I1006 16:37:40.554982 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-n2jvs"] Oct 06 16:37:41 crc kubenswrapper[4763]: I1006 16:37:41.518283 4763 generic.go:334] "Generic (PLEG): container finished" podID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerID="1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62" exitCode=0 Oct 06 16:37:41 crc kubenswrapper[4763]: I1006 16:37:41.518385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerDied","Data":"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"} Oct 06 16:37:41 crc kubenswrapper[4763]: I1006 16:37:41.522940 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:37:41 crc kubenswrapper[4763]: I1006 16:37:41.597191 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" path="/var/lib/kubelet/pods/1317520f-29bf-4615-b385-b03a9ffa898e/volumes" Oct 06 16:37:42 crc kubenswrapper[4763]: I1006 16:37:42.539288 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerStarted","Data":"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73"} Oct 06 16:37:42 crc kubenswrapper[4763]: I1006 16:37:42.569701 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f2dz8" podStartSLOduration=4.227653844 podStartE2EDuration="9.569684924s" podCreationTimestamp="2025-10-06 16:37:33 +0000 UTC" firstStartedPulling="2025-10-06 16:37:36.613714176 +0000 UTC m=+6253.769006688" lastFinishedPulling="2025-10-06 16:37:41.955745206 +0000 UTC m=+6259.111037768" observedRunningTime="2025-10-06 16:37:42.564055361 +0000 UTC m=+6259.719347873" watchObservedRunningTime="2025-10-06 16:37:42.569684924 +0000 UTC m=+6259.724977436" Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.029012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.029336 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-central-agent" containerID="cri-o://b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a" gracePeriod=30 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.029410 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="proxy-httpd" containerID="cri-o://56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82" gracePeriod=30 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.029452 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-notification-agent" containerID="cri-o://78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2" gracePeriod=30 Oct 06 
16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.029453 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="sg-core" containerID="cri-o://50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5" gracePeriod=30 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.381927 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.382355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.551999 4763 generic.go:334] "Generic (PLEG): container finished" podID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerID="56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82" exitCode=0 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.552036 4763 generic.go:334] "Generic (PLEG): container finished" podID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerID="50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5" exitCode=2 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.552045 4763 generic.go:334] "Generic (PLEG): container finished" podID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerID="b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a" exitCode=0 Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.552076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerDied","Data":"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82"} Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.552132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerDied","Data":"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5"} Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.552147 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerDied","Data":"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a"} Oct 06 16:37:43 crc kubenswrapper[4763]: I1006 16:37:43.586395 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:37:43 crc kubenswrapper[4763]: E1006 16:37:43.586685 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:37:44 crc kubenswrapper[4763]: I1006 16:37:44.428684 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f2dz8" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" probeResult="failure" output=< Oct 06 16:37:44 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 16:37:44 crc kubenswrapper[4763]: > Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.315316 4763 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516530 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516567 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67wnh\" (UniqueName: \"kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516760 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.516823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml\") pod \"711d3e7e-7763-4236-b7b8-6b11f32eb091\" (UID: \"711d3e7e-7763-4236-b7b8-6b11f32eb091\") " Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.517250 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.517386 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.517400 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/711d3e7e-7763-4236-b7b8-6b11f32eb091-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.528900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts" (OuterVolumeSpecName: "scripts") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.529047 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh" (OuterVolumeSpecName: "kube-api-access-67wnh") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "kube-api-access-67wnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.549307 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.600850 4763 generic.go:334] "Generic (PLEG): container finished" podID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerID="78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2" exitCode=0 Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.600968 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.619798 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.619832 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.619848 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67wnh\" (UniqueName: \"kubernetes.io/projected/711d3e7e-7763-4236-b7b8-6b11f32eb091-kube-api-access-67wnh\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.635313 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.646877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerDied","Data":"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2"} Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.646934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"711d3e7e-7763-4236-b7b8-6b11f32eb091","Type":"ContainerDied","Data":"097f82806b380d4830add2b970742ebfc6b72f5dc29c3ee67640183ba8fc48b5"} Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.647179 4763 scope.go:117] "RemoveContainer" containerID="56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.662488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data" (OuterVolumeSpecName: "config-data") pod "711d3e7e-7763-4236-b7b8-6b11f32eb091" (UID: "711d3e7e-7763-4236-b7b8-6b11f32eb091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.683113 4763 scope.go:117] "RemoveContainer" containerID="50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.710457 4763 scope.go:117] "RemoveContainer" containerID="78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.721973 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.722002 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711d3e7e-7763-4236-b7b8-6b11f32eb091-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.747561 4763 scope.go:117] "RemoveContainer" containerID="b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.770040 4763 scope.go:117] "RemoveContainer" containerID="56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.770520 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82\": container with ID starting with 56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82 not found: ID does not exist" containerID="56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.770565 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82"} err="failed to get container status \"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82\": rpc error: code = NotFound desc = could not find container \"56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82\": container with ID starting with 56876e665e28027df205a7648e8e458286fe0245cff9ba1d2a5815db04592d82 not found: ID does not 
exist" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.770591 4763 scope.go:117] "RemoveContainer" containerID="50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.771059 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5\": container with ID starting with 50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5 not found: ID does not exist" containerID="50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.771109 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5"} err="failed to get container status \"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5\": rpc error: code = NotFound desc = could not find container \"50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5\": container with ID starting with 50b2a740ecc0390cc40392fb5b8ef82d8aef2a1c49a4cf5f452d0b5c60feabe5 not found: ID does not exist" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.771123 4763 scope.go:117] "RemoveContainer" containerID="78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.771399 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2\": container with ID starting with 78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2 not found: ID does not exist" containerID="78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.771425 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2"} err="failed to get container status \"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2\": rpc error: code = NotFound desc = could not find container \"78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2\": container with ID starting with 78954d4cdf9e9993cf4e86b03fc2a65d37fb0177ab20a2e400f3a2d726cdd7d2 not found: ID does not exist" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.771437 4763 scope.go:117] "RemoveContainer" containerID="b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.771667 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a\": container with ID starting with b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a not found: ID does not exist" containerID="b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.771700 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a"} err="failed to get container status \"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a\": rpc error: code = NotFound desc = could not find container 
\"b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a\": container with ID starting with b3fe1cc6f76dae6ad9341776d38607faa936b91876731759ef99d636b7704e6a not found: ID does not exist" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.954441 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.965331 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.980859 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="proxy-httpd" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981352 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="proxy-httpd" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981362 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="init" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981368 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="init" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981382 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-notification-agent" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981389 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-notification-agent" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981410 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="sg-core" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981416 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="sg-core" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981424 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="dnsmasq-dns" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981430 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="dnsmasq-dns" Oct 06 16:37:47 crc kubenswrapper[4763]: E1006 16:37:47.981443 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-central-agent" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981449 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-central-agent" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981653 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="sg-core" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981673 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="proxy-httpd" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981688 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-central-agent" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981698 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1317520f-29bf-4615-b385-b03a9ffa898e" containerName="dnsmasq-dns" Oct 06 16:37:47 crc kubenswrapper[4763]: I1006 16:37:47.981709 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" containerName="ceilometer-notification-agent" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.009820 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.012414 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.013427 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.015057 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-scripts\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-run-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-log-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032682 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-config-data\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.032826 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zfmj5\" (UniqueName: \"kubernetes.io/projected/9073dd03-373e-4003-972e-b44569066488-kube-api-access-zfmj5\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.134585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-config-data\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.134703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmj5\" (UniqueName: \"kubernetes.io/projected/9073dd03-373e-4003-972e-b44569066488-kube-api-access-zfmj5\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.134885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-scripts\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.134923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.134946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-run-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.135025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-log-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.135098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.135574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-log-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.135988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9073dd03-373e-4003-972e-b44569066488-run-httpd\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.138576 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.140224 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-config-data\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.140689 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.143715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9073dd03-373e-4003-972e-b44569066488-scripts\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.152921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfmj5\" (UniqueName: \"kubernetes.io/projected/9073dd03-373e-4003-972e-b44569066488-kube-api-access-zfmj5\") pod \"ceilometer-0\" (UID: \"9073dd03-373e-4003-972e-b44569066488\") " pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.344159 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 16:37:48 crc kubenswrapper[4763]: I1006 16:37:48.848307 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 16:37:49 crc kubenswrapper[4763]: I1006 16:37:49.590758 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711d3e7e-7763-4236-b7b8-6b11f32eb091" path="/var/lib/kubelet/pods/711d3e7e-7763-4236-b7b8-6b11f32eb091/volumes" Oct 06 16:37:49 crc kubenswrapper[4763]: I1006 16:37:49.636874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9073dd03-373e-4003-972e-b44569066488","Type":"ContainerStarted","Data":"0aa9884bf0aed6463852f37e367aa1424070527fd30a91bbd9613fa52ab2cdd3"} Oct 06 16:37:49 crc kubenswrapper[4763]: I1006 16:37:49.636917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9073dd03-373e-4003-972e-b44569066488","Type":"ContainerStarted","Data":"a027529edda9c40ded4881d88e20ed466a9337fcb79129f3c8e9b007eee7a9c9"} Oct 06 16:37:50 crc kubenswrapper[4763]: I1006 16:37:50.414785 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 06 16:37:50 crc kubenswrapper[4763]: I1006 16:37:50.513699 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 06 16:37:50 crc kubenswrapper[4763]: I1006 16:37:50.649006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9073dd03-373e-4003-972e-b44569066488","Type":"ContainerStarted","Data":"9d7125faa70c3aaaca2e3ef98f7aff176bbd79621983aa33b1ab5a8911302fc3"} Oct 06 16:37:50 crc kubenswrapper[4763]: I1006 16:37:50.665062 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 06 
16:37:51 crc kubenswrapper[4763]: I1006 16:37:51.678792 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9073dd03-373e-4003-972e-b44569066488","Type":"ContainerStarted","Data":"75c014ea422edfcfed25282e6600f5d2221ad389c85289773841e1221ff2e02e"} Oct 06 16:37:53 crc kubenswrapper[4763]: I1006 16:37:53.701919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9073dd03-373e-4003-972e-b44569066488","Type":"ContainerStarted","Data":"8be7ccab6afbb208ef3f86ddf3f4f9213a1c08e9686e1c9214000f1d92951b55"} Oct 06 16:37:53 crc kubenswrapper[4763]: I1006 16:37:53.702420 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 16:37:53 crc kubenswrapper[4763]: I1006 16:37:53.727762 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.086986207 podStartE2EDuration="6.72774399s" podCreationTimestamp="2025-10-06 16:37:47 +0000 UTC" firstStartedPulling="2025-10-06 16:37:48.852123161 +0000 UTC m=+6266.007415683" lastFinishedPulling="2025-10-06 16:37:52.492880934 +0000 UTC m=+6269.648173466" observedRunningTime="2025-10-06 16:37:53.717699007 +0000 UTC m=+6270.872991519" watchObservedRunningTime="2025-10-06 16:37:53.72774399 +0000 UTC m=+6270.883036502" Oct 06 16:37:54 crc kubenswrapper[4763]: I1006 16:37:54.493246 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f2dz8" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" probeResult="failure" output=< Oct 06 16:37:54 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 16:37:54 crc kubenswrapper[4763]: > Oct 06 16:37:56 crc kubenswrapper[4763]: I1006 16:37:56.575508 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:37:56 crc kubenswrapper[4763]: E1006 16:37:56.576443 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.295309 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"] Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.299837 4763 util.go:30] "No sandbox for pod can be found. 
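
The recurring startup-probe failure above ('timeout: failed to connect service ":50051" within 1s') is the marketplace registry-server's health endpoint not answering while the catalog is still unpacking; the pod only turns ready at 16:38:03 below, once the probe connects. A coarse stand-in that checks only TCP reachability with the same 1s budget (the real probe additionally speaks the gRPC health-checking protocol once connected):

    import socket

    def tcp_probe(host: str, port: int, timeout_s: float = 1.0) -> bool:
        """True if a TCP connection succeeds within timeout_s. Reachability
        only -- the real check also issues a gRPC health RPC."""
        try:
            with socket.create_connection((host, port), timeout=timeout_s):
                return True
        except OSError:
            return False

    # Against a registry-server still loading its catalog this returns False,
    # matching the probe failures logged above.
    print(tcp_probe("127.0.0.1", 50051))
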
Need to start a new one" pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.309903 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"] Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.442270 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.442539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2x8\" (UniqueName: \"kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.442739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.545094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.545206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2x8\" (UniqueName: \"kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.545275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.545953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.546234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.566019 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pw2x8\" (UniqueName: \"kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8\") pod \"certified-operators-6tjdt\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") " pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:57 crc kubenswrapper[4763]: I1006 16:37:57.617721 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tjdt" Oct 06 16:37:58 crc kubenswrapper[4763]: I1006 16:37:58.233532 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"] Oct 06 16:37:58 crc kubenswrapper[4763]: I1006 16:37:58.753839 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerID="1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f" exitCode=0 Oct 06 16:37:58 crc kubenswrapper[4763]: I1006 16:37:58.753941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerDied","Data":"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f"} Oct 06 16:37:58 crc kubenswrapper[4763]: I1006 16:37:58.754109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerStarted","Data":"d8a5bbe396130694249d2af08289650b04a4807553eabd8500742d5f62cf5323"} Oct 06 16:37:59 crc kubenswrapper[4763]: I1006 16:37:59.766862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerStarted","Data":"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"} Oct 06 16:38:01 crc kubenswrapper[4763]: I1006 16:38:01.787789 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerID="a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac" exitCode=0 Oct 06 16:38:01 crc kubenswrapper[4763]: I1006 16:38:01.788147 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerDied","Data":"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"} Oct 06 16:38:03 crc kubenswrapper[4763]: I1006 16:38:03.432321 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:38:03 crc kubenswrapper[4763]: I1006 16:38:03.493144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:38:03 crc kubenswrapper[4763]: I1006 16:38:03.830485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerStarted","Data":"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"} Oct 06 16:38:03 crc kubenswrapper[4763]: I1006 16:38:03.852126 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tjdt" podStartSLOduration=2.87484343 podStartE2EDuration="6.852106065s" podCreationTimestamp="2025-10-06 16:37:57 +0000 UTC" firstStartedPulling="2025-10-06 16:37:58.755871193 +0000 UTC 
m=+6275.911163705" lastFinishedPulling="2025-10-06 16:38:02.733133828 +0000 UTC m=+6279.888426340" observedRunningTime="2025-10-06 16:38:03.846981186 +0000 UTC m=+6281.002273708" watchObservedRunningTime="2025-10-06 16:38:03.852106065 +0000 UTC m=+6281.007398577" Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.262905 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"] Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.264515 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f2dz8" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" containerID="cri-o://e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73" gracePeriod=2 Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.811955 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2dz8" Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.859203 4763 generic.go:334] "Generic (PLEG): container finished" podID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerID="e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73" exitCode=0 Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.859249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerDied","Data":"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73"} Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.859310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2dz8" event={"ID":"8f8eb590-3a6f-4c0c-a576-acac2d84c133","Type":"ContainerDied","Data":"a62e050f5e582f46562cc1a1e26e487bcdf3e48884528b579edbb6bc1a898add"} Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.859333 4763 scope.go:117] "RemoveContainer" containerID="e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73" Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.859361 4763 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.895459 4763 scope.go:117] "RemoveContainer" containerID="1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.936197 4763 scope.go:117] "RemoveContainer" containerID="b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec"
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.937074 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content\") pod \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") "
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.937140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cp7f\" (UniqueName: \"kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f\") pod \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") "
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.937211 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities\") pod \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\" (UID: \"8f8eb590-3a6f-4c0c-a576-acac2d84c133\") "
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.940004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities" (OuterVolumeSpecName: "utilities") pod "8f8eb590-3a6f-4c0c-a576-acac2d84c133" (UID: "8f8eb590-3a6f-4c0c-a576-acac2d84c133"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:38:05 crc kubenswrapper[4763]: I1006 16:38:05.944979 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f" (OuterVolumeSpecName: "kube-api-access-5cp7f") pod "8f8eb590-3a6f-4c0c-a576-acac2d84c133" (UID: "8f8eb590-3a6f-4c0c-a576-acac2d84c133"). InnerVolumeSpecName "kube-api-access-5cp7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.033818 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f8eb590-3a6f-4c0c-a576-acac2d84c133" (UID: "8f8eb590-3a6f-4c0c-a576-acac2d84c133"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.038452 4763 scope.go:117] "RemoveContainer" containerID="e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73"
Oct 06 16:38:06 crc kubenswrapper[4763]: E1006 16:38:06.038959 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73\": container with ID starting with e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73 not found: ID does not exist" containerID="e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.039027 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73"} err="failed to get container status \"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73\": rpc error: code = NotFound desc = could not find container \"e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73\": container with ID starting with e8ed4d8550890e0bf20037c130a531d9204f7fbdf81753fd1e92d32e430a0b73 not found: ID does not exist"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.039057 4763 scope.go:117] "RemoveContainer" containerID="1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"
Oct 06 16:38:06 crc kubenswrapper[4763]: E1006 16:38:06.039563 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62\": container with ID starting with 1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62 not found: ID does not exist" containerID="1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.039601 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62"} err="failed to get container status \"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62\": rpc error: code = NotFound desc = could not find container \"1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62\": container with ID starting with 1f8941db799cbff8817ee1f4566713d0902706808ba8979b31aaad0deba93d62 not found: ID does not exist"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.039640 4763 scope.go:117] "RemoveContainer" containerID="b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec"
Oct 06 16:38:06 crc kubenswrapper[4763]: E1006 16:38:06.040049 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec\": container with ID starting with b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec not found: ID does not exist" containerID="b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.040082 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec"} err="failed to get container status \"b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec\": rpc error: code = NotFound desc = could not find container \"b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec\": container with ID starting with b6ffa38df721879382a102f76e7c7a1ae26fd7ebb8aa96907d4413c95f214fec not found: ID does not exist"
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.040503 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.040573 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cp7f\" (UniqueName: \"kubernetes.io/projected/8f8eb590-3a6f-4c0c-a576-acac2d84c133-kube-api-access-5cp7f\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.040592 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8eb590-3a6f-4c0c-a576-acac2d84c133-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.193780 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"]
Oct 06 16:38:06 crc kubenswrapper[4763]: I1006 16:38:06.202161 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f2dz8"]
Oct 06 16:38:07 crc kubenswrapper[4763]: I1006 16:38:07.597425 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" path="/var/lib/kubelet/pods/8f8eb590-3a6f-4c0c-a576-acac2d84c133/volumes"
Oct 06 16:38:07 crc kubenswrapper[4763]: I1006 16:38:07.618685 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:07 crc kubenswrapper[4763]: I1006 16:38:07.619717 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:07 crc kubenswrapper[4763]: I1006 16:38:07.672287 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:07 crc kubenswrapper[4763]: I1006 16:38:07.961394 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:08 crc kubenswrapper[4763]: I1006 16:38:08.860297 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"]
Oct 06 16:38:09 crc kubenswrapper[4763]: I1006 16:38:09.931396 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tjdt" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="registry-server" containerID="cri-o://6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8" gracePeriod=2
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.592187 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.645526 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content\") pod \"bbd39c02-09cb-4397-a0f5-0e39554f2455\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") "
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.645726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities\") pod \"bbd39c02-09cb-4397-a0f5-0e39554f2455\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") "
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.645819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw2x8\" (UniqueName: \"kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8\") pod \"bbd39c02-09cb-4397-a0f5-0e39554f2455\" (UID: \"bbd39c02-09cb-4397-a0f5-0e39554f2455\") "
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.648054 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities" (OuterVolumeSpecName: "utilities") pod "bbd39c02-09cb-4397-a0f5-0e39554f2455" (UID: "bbd39c02-09cb-4397-a0f5-0e39554f2455"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.653294 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8" (OuterVolumeSpecName: "kube-api-access-pw2x8") pod "bbd39c02-09cb-4397-a0f5-0e39554f2455" (UID: "bbd39c02-09cb-4397-a0f5-0e39554f2455"). InnerVolumeSpecName "kube-api-access-pw2x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.689670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbd39c02-09cb-4397-a0f5-0e39554f2455" (UID: "bbd39c02-09cb-4397-a0f5-0e39554f2455"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.752521 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.752832 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbd39c02-09cb-4397-a0f5-0e39554f2455-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.752856 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw2x8\" (UniqueName: \"kubernetes.io/projected/bbd39c02-09cb-4397-a0f5-0e39554f2455-kube-api-access-pw2x8\") on node \"crc\" DevicePath \"\""
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.947911 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerID="6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8" exitCode=0
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.947972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerDied","Data":"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"}
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.948009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tjdt" event={"ID":"bbd39c02-09cb-4397-a0f5-0e39554f2455","Type":"ContainerDied","Data":"d8a5bbe396130694249d2af08289650b04a4807553eabd8500742d5f62cf5323"}
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.948038 4763 scope.go:117] "RemoveContainer" containerID="6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.948239 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tjdt"
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.981230 4763 scope.go:117] "RemoveContainer" containerID="a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"
Oct 06 16:38:10 crc kubenswrapper[4763]: I1006 16:38:10.999083 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"]
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.010795 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tjdt"]
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.023542 4763 scope.go:117] "RemoveContainer" containerID="1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f"
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.055965 4763 scope.go:117] "RemoveContainer" containerID="6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"
Oct 06 16:38:11 crc kubenswrapper[4763]: E1006 16:38:11.056426 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8\": container with ID starting with 6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8 not found: ID does not exist" containerID="6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.056495 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8"} err="failed to get container status \"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8\": rpc error: code = NotFound desc = could not find container \"6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8\": container with ID starting with 6b69588b39384ec3362e18b224af111f64399864c5f5b3f91a6d1b4e7660b7e8 not found: ID does not exist"
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.056521 4763 scope.go:117] "RemoveContainer" containerID="a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"
Oct 06 16:38:11 crc kubenswrapper[4763]: E1006 16:38:11.057028 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac\": container with ID starting with a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac not found: ID does not exist" containerID="a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.057058 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac"} err="failed to get container status \"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac\": rpc error: code = NotFound desc = could not find container \"a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac\": container with ID starting with a221cd32ef74e84a167805e91c7758cbc6b6774e2b010c002ff279c342c15eac not found: ID does not exist"
Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.057077 4763 scope.go:117] "RemoveContainer" containerID="1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f"
Oct 06 16:38:11 crc kubenswrapper[4763]: E1006 16:38:11.057331 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f\": container with ID starting with 1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f not found: ID does not exist" containerID="1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f"
failed" err="rpc error: code = NotFound desc = could not find container \"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f\": container with ID starting with 1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f not found: ID does not exist" containerID="1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f" Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.057360 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f"} err="failed to get container status \"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f\": rpc error: code = NotFound desc = could not find container \"1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f\": container with ID starting with 1c1c64b0f231ee15ba5c7c6ffcee6bdf5d5a03bf8c8b3b4bd1fbd4e20a311a1f not found: ID does not exist" Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.575356 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:38:11 crc kubenswrapper[4763]: E1006 16:38:11.576014 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:38:11 crc kubenswrapper[4763]: I1006 16:38:11.598654 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" path="/var/lib/kubelet/pods/bbd39c02-09cb-4397-a0f5-0e39554f2455/volumes" Oct 06 16:38:18 crc kubenswrapper[4763]: I1006 16:38:18.355369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 16:38:26 crc kubenswrapper[4763]: I1006 16:38:26.575529 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:38:26 crc kubenswrapper[4763]: E1006 16:38:26.576529 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:38:33 crc kubenswrapper[4763]: I1006 16:38:33.111371 4763 scope.go:117] "RemoveContainer" containerID="ae1c758d43eec4438ddc622373d0efdbdad91508e76b510e36922681a2272269" Oct 06 16:38:33 crc kubenswrapper[4763]: I1006 16:38:33.141395 4763 scope.go:117] "RemoveContainer" containerID="16b1086aa399df279da15b3b684b672230925ed0d17ac793a0e91a03333f299b" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.627291 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"] Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628247 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="extract-content" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628261 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="extract-content" Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628274 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="extract-utilities" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628280 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="extract-utilities" Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628292 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628297 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628315 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="extract-utilities" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628321 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="extract-utilities" Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628349 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="extract-content" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628354 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="extract-content" Oct 06 16:38:38 crc kubenswrapper[4763]: E1006 16:38:38.628368 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628374 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628572 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd39c02-09cb-4397-a0f5-0e39554f2455" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.628585 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8eb590-3a6f-4c0c-a576-acac2d84c133" containerName="registry-server" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.629700 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.631074 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.644745 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"] Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.651240 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.651306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.651470 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.653210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.653238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.653269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8rc\" (UniqueName: \"kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " 
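machine-config-daemon-9g2sw is sitting inside the terminal CrashLoopBackOff delay here: the pod worker re-evaluates the pod roughly every 15 s (16:37:56, 16:38:11, 16:38:26) but declines to restart the container while the logged "back-off 5m0s" window is open. The window expires shortly afterwards; at 16:38:40 the same "RemoveContainer" is followed by a ContainerStarted event for the pod, visible just below. A sketch that lists the skipped attempts and their spacing (one entry per line assumed, hypothetical file name):

```python
import re
from datetime import datetime

# Back-off skips for the crash-looping machine-config-daemon pod.
PAT = re.compile(r'^Oct 06 (\d\d:\d\d:\d\d).*CrashLoopBackOff.*machine-config-daemon-9g2sw')

times = []
with open("kubelet.log") as f:  # hypothetical file name
    for line in f:
        if (m := PAT.match(line)):
            times.append(datetime.strptime(m.group(1), "%H:%M:%S"))

for a, b in zip(times, times[1:]):
    print(f"{a.time()} -> {b.time()}  gap {(b - a).total_seconds():.0f}s")
```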
pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754390 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8rc\" (UniqueName: \"kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.754500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.755720 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.755751 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.756126 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.756197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.756413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 
16:38:38.789273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8rc\" (UniqueName: \"kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc\") pod \"dnsmasq-dns-65f77b9c99-dqz8f\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") " pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:38 crc kubenswrapper[4763]: I1006 16:38:38.953330 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:39 crc kubenswrapper[4763]: I1006 16:38:39.669810 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"] Oct 06 16:38:40 crc kubenswrapper[4763]: I1006 16:38:40.272799 4763 generic.go:334] "Generic (PLEG): container finished" podID="f777ffbe-af34-4643-9253-6768a327dd6e" containerID="b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691" exitCode=0 Oct 06 16:38:40 crc kubenswrapper[4763]: I1006 16:38:40.272979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" event={"ID":"f777ffbe-af34-4643-9253-6768a327dd6e","Type":"ContainerDied","Data":"b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691"} Oct 06 16:38:40 crc kubenswrapper[4763]: I1006 16:38:40.273261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" event={"ID":"f777ffbe-af34-4643-9253-6768a327dd6e","Type":"ContainerStarted","Data":"5754a672de7758ae0b5db1380508344aa496ad371eb6cbca2608eb9e48b10fbd"} Oct 06 16:38:40 crc kubenswrapper[4763]: I1006 16:38:40.575470 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:38:41 crc kubenswrapper[4763]: I1006 16:38:41.289188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a"} Oct 06 16:38:41 crc kubenswrapper[4763]: I1006 16:38:41.291842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" event={"ID":"f777ffbe-af34-4643-9253-6768a327dd6e","Type":"ContainerStarted","Data":"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31"} Oct 06 16:38:41 crc kubenswrapper[4763]: I1006 16:38:41.292123 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:41 crc kubenswrapper[4763]: I1006 16:38:41.345856 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" podStartSLOduration=3.34478483 podStartE2EDuration="3.34478483s" podCreationTimestamp="2025-10-06 16:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:38:41.336012251 +0000 UTC m=+6318.491304763" watchObservedRunningTime="2025-10-06 16:38:41.34478483 +0000 UTC m=+6318.500077342" Oct 06 16:38:48 crc kubenswrapper[4763]: I1006 16:38:48.954816 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.058751 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"] Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 
16:38:49.059037 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="dnsmasq-dns" containerID="cri-o://da84464a3e539736bd6da5e60f0fb1024043262c814eda4ad1c0642ea20b63b3" gracePeriod=10 Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.219767 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-tjwkx"] Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.222456 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.229547 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-tjwkx"] Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.405739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.406082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.406170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.406235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.406281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-config\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.406346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5q7c\" (UniqueName: \"kubernetes.io/projected/71d20e8b-1ab2-4024-bd3c-4651186071c5-kube-api-access-t5q7c\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.425471 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerID="da84464a3e539736bd6da5e60f0fb1024043262c814eda4ad1c0642ea20b63b3" exitCode=0 Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 
16:38:49.425519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" event={"ID":"bd64ad5c-f547-4c6d-8046-701a2f205431","Type":"ContainerDied","Data":"da84464a3e539736bd6da5e60f0fb1024043262c814eda4ad1c0642ea20b63b3"} Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.508789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.508926 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.508997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.509043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-config\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.509101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5q7c\" (UniqueName: \"kubernetes.io/projected/71d20e8b-1ab2-4024-bd3c-4651186071c5-kube-api-access-t5q7c\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.509161 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.509896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.509969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.510590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-config\") pod 
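The "Observed pod startup duration" entries make the tracker's arithmetic visible: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes image-pull time (lastFinishedPulling minus firstStartedPulling). When nothing had to be pulled, the pull timestamps are the zero value ("0001-01-01 00:00:00") and the two durations coincide, as in the dnsmasq-dns-65f77b9c99-dqz8f entry above (both 3.34478483s). The m=+ suffixes are Go's monotonic-clock readings (seconds since the kubelet process started) and are the safest basis for intervals. A check against the ceilometer-0 entry near the top of this section:

```python
# Values copied from the ceilometer-0 "Observed pod startup duration" entry.
creation = 47.0               # podCreationTimestamp 16:37:47 (seconds past 16:37)
watch_running = 53.72774399   # watchObservedRunningTime 16:37:53.72774399
pull_start = 6266.007415683   # firstStartedPulling, monotonic m=+ reading
pull_end = 6269.648173466     # lastFinishedPulling, monotonic m=+ reading

e2e = watch_running - creation        # -> podStartE2EDuration
slo = e2e - (pull_end - pull_start)   # -> podStartSLOduration
# Matches the logged 6.72774399s / 3.086986207 to rounding:
print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")
```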
\"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.511269 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.511656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71d20e8b-1ab2-4024-bd3c-4651186071c5-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.537851 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5q7c\" (UniqueName: \"kubernetes.io/projected/71d20e8b-1ab2-4024-bd3c-4651186071c5-kube-api-access-t5q7c\") pod \"dnsmasq-dns-df8f9c6bc-tjwkx\" (UID: \"71d20e8b-1ab2-4024-bd3c-4651186071c5\") " pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.553531 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.692948 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.816536 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m5px\" (UniqueName: \"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.816691 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.816794 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.816855 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.817062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.828351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px" (OuterVolumeSpecName: "kube-api-access-8m5px") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "kube-api-access-8m5px". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.898472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.906156 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.918737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config" (OuterVolumeSpecName: "config") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.919575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") pod \"bd64ad5c-f547-4c6d-8046-701a2f205431\" (UID: \"bd64ad5c-f547-4c6d-8046-701a2f205431\") " Oct 06 16:38:49 crc kubenswrapper[4763]: W1006 16:38:49.919999 4763 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bd64ad5c-f547-4c6d-8046-701a2f205431/volumes/kubernetes.io~configmap/config Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.920017 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config" (OuterVolumeSpecName: "config") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.920543 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.920590 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-config\") on node \"crc\" DevicePath \"\"" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.920602 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m5px\" (UniqueName: \"kubernetes.io/projected/bd64ad5c-f547-4c6d-8046-701a2f205431-kube-api-access-8m5px\") on node \"crc\" DevicePath \"\"" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.920666 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 16:38:49 crc kubenswrapper[4763]: I1006 16:38:49.928487 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd64ad5c-f547-4c6d-8046-701a2f205431" (UID: "bd64ad5c-f547-4c6d-8046-701a2f205431"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.022359 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd64ad5c-f547-4c6d-8046-701a2f205431-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 16:38:50 crc kubenswrapper[4763]: W1006 16:38:50.067999 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71d20e8b_1ab2_4024_bd3c_4651186071c5.slice/crio-79bc278869dc1fcc49d1f9d1b2a3ea2d57c549ef5bc1437b8ff94623ef6669ce WatchSource:0}: Error finding container 79bc278869dc1fcc49d1f9d1b2a3ea2d57c549ef5bc1437b8ff94623ef6669ce: Status 404 returned error can't find the container with id 79bc278869dc1fcc49d1f9d1b2a3ea2d57c549ef5bc1437b8ff94623ef6669ce Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.068061 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-tjwkx"] Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.435800 4763 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.435799 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-7m5cq" event={"ID":"bd64ad5c-f547-4c6d-8046-701a2f205431","Type":"ContainerDied","Data":"8a300fd10bdbd58db2defc8858a187c5877fd5262782f06aa5239a13a115f7e5"}
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.435918 4763 scope.go:117] "RemoveContainer" containerID="da84464a3e539736bd6da5e60f0fb1024043262c814eda4ad1c0642ea20b63b3"
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.438541 4763 generic.go:334] "Generic (PLEG): container finished" podID="71d20e8b-1ab2-4024-bd3c-4651186071c5" containerID="86b4ce9643a2b29ba4b44edc923b5ab8b8e240d96711fc934dfbc273b95c6e43" exitCode=0
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.438582 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" event={"ID":"71d20e8b-1ab2-4024-bd3c-4651186071c5","Type":"ContainerDied","Data":"86b4ce9643a2b29ba4b44edc923b5ab8b8e240d96711fc934dfbc273b95c6e43"}
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.438654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" event={"ID":"71d20e8b-1ab2-4024-bd3c-4651186071c5","Type":"ContainerStarted","Data":"79bc278869dc1fcc49d1f9d1b2a3ea2d57c549ef5bc1437b8ff94623ef6669ce"}
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.482814 4763 scope.go:117] "RemoveContainer" containerID="dece35b188bdbab644fa074f86b5f1216b3b3bb5b8f8c526654659f1a0dd7f17"
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.666849 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"]
Oct 06 16:38:50 crc kubenswrapper[4763]: I1006 16:38:50.676092 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-7m5cq"]
Oct 06 16:38:51 crc kubenswrapper[4763]: I1006 16:38:51.452784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" event={"ID":"71d20e8b-1ab2-4024-bd3c-4651186071c5","Type":"ContainerStarted","Data":"cd862918f03228517e9f6243993e387b04bdf93483a77c3d56a914fc37b6e1ab"}
Oct 06 16:38:51 crc kubenswrapper[4763]: I1006 16:38:51.453278 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx"
Oct 06 16:38:51 crc kubenswrapper[4763]: I1006 16:38:51.474922 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx" podStartSLOduration=2.474903692 podStartE2EDuration="2.474903692s" podCreationTimestamp="2025-10-06 16:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 16:38:51.4733689 +0000 UTC m=+6328.628661482" watchObservedRunningTime="2025-10-06 16:38:51.474903692 +0000 UTC m=+6328.630196204"
Oct 06 16:38:51 crc kubenswrapper[4763]: I1006 16:38:51.590172 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" path="/var/lib/kubelet/pods/bd64ad5c-f547-4c6d-8046-701a2f205431/volumes"
Oct 06 16:38:59 crc kubenswrapper[4763]: I1006 16:38:59.555830 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df8f9c6bc-tjwkx"
Oct 06 16:38:59 crc kubenswrapper[4763]: I1006 16:38:59.632976 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"]
Oct 06 16:38:59 crc kubenswrapper[4763]: I1006 16:38:59.633251 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="dnsmasq-dns" containerID="cri-o://70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31" gracePeriod=10
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.209714 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f"
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.318800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.319249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.319394 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.319463 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.319640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.319730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8rc\" (UniqueName: \"kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc\") pod \"f777ffbe-af34-4643-9253-6768a327dd6e\" (UID: \"f777ffbe-af34-4643-9253-6768a327dd6e\") "
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.329285 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc" (OuterVolumeSpecName: "kube-api-access-cx8rc") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "kube-api-access-cx8rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.383396 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.390609 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config" (OuterVolumeSpecName: "config") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.398269 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.399216 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.406499 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f777ffbe-af34-4643-9253-6768a327dd6e" (UID: "f777ffbe-af34-4643-9253-6768a327dd6e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423349 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-config\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423394 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8rc\" (UniqueName: \"kubernetes.io/projected/f777ffbe-af34-4643-9253-6768a327dd6e-kube-api-access-cx8rc\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423411 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423422 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423434 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-openstack-cell1\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.423446 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f777ffbe-af34-4643-9253-6768a327dd6e-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.548709 4763 generic.go:334] "Generic (PLEG): container finished"
podID="f777ffbe-af34-4643-9253-6768a327dd6e" containerID="70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31" exitCode=0 Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.548786 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" event={"ID":"f777ffbe-af34-4643-9253-6768a327dd6e","Type":"ContainerDied","Data":"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31"} Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.548820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" event={"ID":"f777ffbe-af34-4643-9253-6768a327dd6e","Type":"ContainerDied","Data":"5754a672de7758ae0b5db1380508344aa496ad371eb6cbca2608eb9e48b10fbd"} Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.548842 4763 scope.go:117] "RemoveContainer" containerID="70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31" Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.549077 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-dqz8f" Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.588904 4763 scope.go:117] "RemoveContainer" containerID="b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691" Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.608794 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"] Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.616153 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-dqz8f"] Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.622426 4763 scope.go:117] "RemoveContainer" containerID="70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31" Oct 06 16:39:00 crc kubenswrapper[4763]: E1006 16:39:00.623018 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31\": container with ID starting with 70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31 not found: ID does not exist" containerID="70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31" Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.623067 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31"} err="failed to get container status \"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31\": rpc error: code = NotFound desc = could not find container \"70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31\": container with ID starting with 70e84cd83cdeee7a262163ee8ede808c60a0e0097ba78c37d62dfbcffe7fae31 not found: ID does not exist" Oct 06 16:39:00 crc kubenswrapper[4763]: I1006 16:39:00.623097 4763 scope.go:117] "RemoveContainer" containerID="b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691" Oct 06 16:39:00 crc kubenswrapper[4763]: E1006 16:39:00.623583 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691\": container with ID starting with b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691 not found: ID does not exist" containerID="b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691" Oct 06 16:39:00 crc 
kubenswrapper[4763]: I1006 16:39:00.623639 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691"} err="failed to get container status \"b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691\": rpc error: code = NotFound desc = could not find container \"b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691\": container with ID starting with b66a7caea192c394a3e6812c4da5d84015601679ce3700e5b82612019ab7d691 not found: ID does not exist" Oct 06 16:39:01 crc kubenswrapper[4763]: I1006 16:39:01.591191 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" path="/var/lib/kubelet/pods/f777ffbe-af34-4643-9253-6768a327dd6e/volumes" Oct 06 16:39:25 crc kubenswrapper[4763]: I1006 16:39:25.045643 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-kpc9x"] Oct 06 16:39:25 crc kubenswrapper[4763]: I1006 16:39:25.058822 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-kpc9x"] Oct 06 16:39:25 crc kubenswrapper[4763]: I1006 16:39:25.591562 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e669be3-799e-4707-85e3-68ae9c66cdaf" path="/var/lib/kubelet/pods/0e669be3-799e-4707-85e3-68ae9c66cdaf/volumes" Oct 06 16:39:33 crc kubenswrapper[4763]: I1006 16:39:33.258276 4763 scope.go:117] "RemoveContainer" containerID="746d8d2f42749b85895dc863c90929f148a3214c0fe4ffb999d9f0a94ce9637c" Oct 06 16:39:37 crc kubenswrapper[4763]: I1006 16:39:37.042875 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-452a-account-create-dtrk2"] Oct 06 16:39:37 crc kubenswrapper[4763]: I1006 16:39:37.054217 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-452a-account-create-dtrk2"] Oct 06 16:39:37 crc kubenswrapper[4763]: I1006 16:39:37.606956 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a671e8-02b3-4112-bc33-dcbf2cbe206a" path="/var/lib/kubelet/pods/50a671e8-02b3-4112-bc33-dcbf2cbe206a/volumes" Oct 06 16:39:43 crc kubenswrapper[4763]: I1006 16:39:43.045546 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-p26fh"] Oct 06 16:39:43 crc kubenswrapper[4763]: I1006 16:39:43.063881 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-p26fh"] Oct 06 16:39:43 crc kubenswrapper[4763]: I1006 16:39:43.601798 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5f2bac-bb4b-4d51-b29d-6af33cb541d1" path="/var/lib/kubelet/pods/3c5f2bac-bb4b-4d51-b29d-6af33cb541d1/volumes" Oct 06 16:39:54 crc kubenswrapper[4763]: I1006 16:39:54.059023 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-6080-account-create-rrkhw"] Oct 06 16:39:54 crc kubenswrapper[4763]: I1006 16:39:54.073464 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-6080-account-create-rrkhw"] Oct 06 16:39:55 crc kubenswrapper[4763]: I1006 16:39:55.591934 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2a63b6-8ba6-4805-9a86-477196011cb3" path="/var/lib/kubelet/pods/7e2a63b6-8ba6-4805-9a86-477196011cb3/volumes" Oct 06 16:40:30 crc kubenswrapper[4763]: I1006 16:40:30.062285 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-g858t"] Oct 06 16:40:30 crc kubenswrapper[4763]: I1006 
16:40:30.077578 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-g858t"] Oct 06 16:40:31 crc kubenswrapper[4763]: I1006 16:40:31.593754 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0" path="/var/lib/kubelet/pods/4a9b1f9d-8dae-45f6-b74c-ba7153afa5e0/volumes" Oct 06 16:40:33 crc kubenswrapper[4763]: I1006 16:40:33.402546 4763 scope.go:117] "RemoveContainer" containerID="04dfb7716644033f1416b3528f59348236e173d8feeff77279d1e50a2565b39d" Oct 06 16:40:33 crc kubenswrapper[4763]: I1006 16:40:33.457426 4763 scope.go:117] "RemoveContainer" containerID="52ce0e843e0439bfe7ae2d2c2ad0a27ec9b548b4e7da5f65ab8a7827e1d03bc7" Oct 06 16:40:33 crc kubenswrapper[4763]: I1006 16:40:33.506041 4763 scope.go:117] "RemoveContainer" containerID="b28a1e8381562c7f40031d57470887f593974afb65c87f52c5fc196a9c48292c" Oct 06 16:40:33 crc kubenswrapper[4763]: I1006 16:40:33.550434 4763 scope.go:117] "RemoveContainer" containerID="696dbf734d2dfb950954cd71adc736367d290fe7c655783de0fff888995a8be0" Oct 06 16:40:33 crc kubenswrapper[4763]: I1006 16:40:33.598062 4763 scope.go:117] "RemoveContainer" containerID="e796b2dae6f1a255cd2e3b61398559032408ff2e8e713cb8186a90663e4ded42" Oct 06 16:41:03 crc kubenswrapper[4763]: I1006 16:41:03.877609 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:41:03 crc kubenswrapper[4763]: I1006 16:41:03.878317 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:41:33 crc kubenswrapper[4763]: I1006 16:41:33.877507 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:41:33 crc kubenswrapper[4763]: I1006 16:41:33.878408 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:42:03 crc kubenswrapper[4763]: I1006 16:42:03.877165 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:42:03 crc kubenswrapper[4763]: I1006 16:42:03.880437 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:42:03 crc kubenswrapper[4763]: I1006 
16:42:03.880596 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:42:03 crc kubenswrapper[4763]: I1006 16:42:03.881462 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:42:03 crc kubenswrapper[4763]: I1006 16:42:03.881645 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a" gracePeriod=600 Oct 06 16:42:04 crc kubenswrapper[4763]: I1006 16:42:04.781887 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a" exitCode=0 Oct 06 16:42:04 crc kubenswrapper[4763]: I1006 16:42:04.782005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a"} Oct 06 16:42:04 crc kubenswrapper[4763]: I1006 16:42:04.782571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"} Oct 06 16:42:04 crc kubenswrapper[4763]: I1006 16:42:04.782598 4763 scope.go:117] "RemoveContainer" containerID="2e875e311413841a9d1bbd2ef2352512968714447a977bb20391ac60b03cf428" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.000730 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:11 crc kubenswrapper[4763]: E1006 16:42:11.002013 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002028 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: E1006 16:42:11.002043 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="init" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002049 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="init" Oct 06 16:42:11 crc kubenswrapper[4763]: E1006 16:42:11.002060 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002067 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: E1006 16:42:11.002086 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="init" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002094 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="init" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002337 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f777ffbe-af34-4643-9253-6768a327dd6e" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.002352 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd64ad5c-f547-4c6d-8046-701a2f205431" containerName="dnsmasq-dns" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.003960 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.011836 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.019210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.019391 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pcnx\" (UniqueName: \"kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.019458 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.121135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.121193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.121316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pcnx\" (UniqueName: \"kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.121778 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.121950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.138487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pcnx\" (UniqueName: \"kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx\") pod \"community-operators-2ptgq\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.321349 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.779920 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:11 crc kubenswrapper[4763]: I1006 16:42:11.889829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerStarted","Data":"4f889cfc3aa8cb319d52610f5034cb22517583e4f20a787cbbdab49046f9d77b"} Oct 06 16:42:12 crc kubenswrapper[4763]: E1006 16:42:12.251323 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723c30ec_916d_4f88_9f67_c19af8d4b402.slice/crio-conmon-084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723c30ec_916d_4f88_9f67_c19af8d4b402.slice/crio-084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a.scope\": RecentStats: unable to find data in memory cache]" Oct 06 16:42:12 crc kubenswrapper[4763]: I1006 16:42:12.916426 4763 generic.go:334] "Generic (PLEG): container finished" podID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerID="084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a" exitCode=0 Oct 06 16:42:12 crc kubenswrapper[4763]: I1006 16:42:12.916490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerDied","Data":"084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a"} Oct 06 16:42:13 crc kubenswrapper[4763]: I1006 16:42:13.928945 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerStarted","Data":"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c"} Oct 06 16:42:14 crc kubenswrapper[4763]: I1006 16:42:14.943577 4763 generic.go:334] "Generic (PLEG): container finished" podID="723c30ec-916d-4f88-9f67-c19af8d4b402" 
containerID="c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c" exitCode=0 Oct 06 16:42:14 crc kubenswrapper[4763]: I1006 16:42:14.943642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerDied","Data":"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c"} Oct 06 16:42:16 crc kubenswrapper[4763]: I1006 16:42:16.963466 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerStarted","Data":"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f"} Oct 06 16:42:16 crc kubenswrapper[4763]: I1006 16:42:16.987219 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2ptgq" podStartSLOduration=4.17792687 podStartE2EDuration="6.987198976s" podCreationTimestamp="2025-10-06 16:42:10 +0000 UTC" firstStartedPulling="2025-10-06 16:42:12.921881579 +0000 UTC m=+6530.077174141" lastFinishedPulling="2025-10-06 16:42:15.731153735 +0000 UTC m=+6532.886446247" observedRunningTime="2025-10-06 16:42:16.981963434 +0000 UTC m=+6534.137255956" watchObservedRunningTime="2025-10-06 16:42:16.987198976 +0000 UTC m=+6534.142491488" Oct 06 16:42:21 crc kubenswrapper[4763]: I1006 16:42:21.321750 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:21 crc kubenswrapper[4763]: I1006 16:42:21.322295 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:21 crc kubenswrapper[4763]: I1006 16:42:21.375471 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:22 crc kubenswrapper[4763]: I1006 16:42:22.123424 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:22 crc kubenswrapper[4763]: I1006 16:42:22.187250 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.058537 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2ptgq" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="registry-server" containerID="cri-o://a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f" gracePeriod=2 Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.643591 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.724434 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities\") pod \"723c30ec-916d-4f88-9f67-c19af8d4b402\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.724528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content\") pod \"723c30ec-916d-4f88-9f67-c19af8d4b402\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.724606 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pcnx\" (UniqueName: \"kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx\") pod \"723c30ec-916d-4f88-9f67-c19af8d4b402\" (UID: \"723c30ec-916d-4f88-9f67-c19af8d4b402\") " Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.725951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities" (OuterVolumeSpecName: "utilities") pod "723c30ec-916d-4f88-9f67-c19af8d4b402" (UID: "723c30ec-916d-4f88-9f67-c19af8d4b402"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.730988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx" (OuterVolumeSpecName: "kube-api-access-5pcnx") pod "723c30ec-916d-4f88-9f67-c19af8d4b402" (UID: "723c30ec-916d-4f88-9f67-c19af8d4b402"). InnerVolumeSpecName "kube-api-access-5pcnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.803594 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "723c30ec-916d-4f88-9f67-c19af8d4b402" (UID: "723c30ec-916d-4f88-9f67-c19af8d4b402"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.826827 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pcnx\" (UniqueName: \"kubernetes.io/projected/723c30ec-916d-4f88-9f67-c19af8d4b402-kube-api-access-5pcnx\") on node \"crc\" DevicePath \"\"" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.826876 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:42:24 crc kubenswrapper[4763]: I1006 16:42:24.826888 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c30ec-916d-4f88-9f67-c19af8d4b402-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.071636 4763 generic.go:334] "Generic (PLEG): container finished" podID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerID="a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f" exitCode=0 Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.071690 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerDied","Data":"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f"} Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.071727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ptgq" event={"ID":"723c30ec-916d-4f88-9f67-c19af8d4b402","Type":"ContainerDied","Data":"4f889cfc3aa8cb319d52610f5034cb22517583e4f20a787cbbdab49046f9d77b"} Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.071727 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2ptgq" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.071777 4763 scope.go:117] "RemoveContainer" containerID="a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.106726 4763 scope.go:117] "RemoveContainer" containerID="c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.128400 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.140813 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2ptgq"] Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.153653 4763 scope.go:117] "RemoveContainer" containerID="084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.200695 4763 scope.go:117] "RemoveContainer" containerID="a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f" Oct 06 16:42:25 crc kubenswrapper[4763]: E1006 16:42:25.201172 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f\": container with ID starting with a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f not found: ID does not exist" containerID="a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.201227 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f"} err="failed to get container status \"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f\": rpc error: code = NotFound desc = could not find container \"a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f\": container with ID starting with a378abf2471ad1acc8e96a8d1d4f460eaa7d1207aca739bf703fea66c22ee23f not found: ID does not exist" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.201257 4763 scope.go:117] "RemoveContainer" containerID="c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c" Oct 06 16:42:25 crc kubenswrapper[4763]: E1006 16:42:25.201754 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c\": container with ID starting with c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c not found: ID does not exist" containerID="c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.201793 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c"} err="failed to get container status \"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c\": rpc error: code = NotFound desc = could not find container \"c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c\": container with ID starting with c7035267b6a0a6bb8dad317bcc075b0ac3d36cb2007be4e44b8dfdfefdfb382c not found: ID does not exist" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.201815 4763 scope.go:117] "RemoveContainer" 
containerID="084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a" Oct 06 16:42:25 crc kubenswrapper[4763]: E1006 16:42:25.202147 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a\": container with ID starting with 084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a not found: ID does not exist" containerID="084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.202190 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a"} err="failed to get container status \"084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a\": rpc error: code = NotFound desc = could not find container \"084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a\": container with ID starting with 084412bbab2f9d54008f4237806208070174df27dfcacc21ad788524a1be403a not found: ID does not exist" Oct 06 16:42:25 crc kubenswrapper[4763]: I1006 16:42:25.587109 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" path="/var/lib/kubelet/pods/723c30ec-916d-4f88-9f67-c19af8d4b402/volumes" Oct 06 16:43:51 crc kubenswrapper[4763]: I1006 16:43:51.049140 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-mm5f2"] Oct 06 16:43:51 crc kubenswrapper[4763]: I1006 16:43:51.061361 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-mm5f2"] Oct 06 16:43:51 crc kubenswrapper[4763]: I1006 16:43:51.598384 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea093121-dc49-4bf5-aae2-0539f4f51467" path="/var/lib/kubelet/pods/ea093121-dc49-4bf5-aae2-0539f4f51467/volumes" Oct 06 16:44:01 crc kubenswrapper[4763]: I1006 16:44:01.034853 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-3e6e-account-create-86nsx"] Oct 06 16:44:01 crc kubenswrapper[4763]: I1006 16:44:01.047664 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-3e6e-account-create-86nsx"] Oct 06 16:44:01 crc kubenswrapper[4763]: I1006 16:44:01.586698 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5444e6ef-05c2-4c2d-b742-03edf03cb800" path="/var/lib/kubelet/pods/5444e6ef-05c2-4c2d-b742-03edf03cb800/volumes" Oct 06 16:44:15 crc kubenswrapper[4763]: I1006 16:44:15.058703 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-9bzwg"] Oct 06 16:44:15 crc kubenswrapper[4763]: I1006 16:44:15.068742 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-9bzwg"] Oct 06 16:44:15 crc kubenswrapper[4763]: I1006 16:44:15.594144 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88647c36-cdad-4200-a1f6-e3d96d1492d9" path="/var/lib/kubelet/pods/88647c36-cdad-4200-a1f6-e3d96d1492d9/volumes" Oct 06 16:44:33 crc kubenswrapper[4763]: I1006 16:44:33.837059 4763 scope.go:117] "RemoveContainer" containerID="98de6adbe357c1533a41e7534659c6f8289365ae839544a75546d049181da255" Oct 06 16:44:33 crc kubenswrapper[4763]: I1006 16:44:33.867592 4763 scope.go:117] "RemoveContainer" containerID="22246f55807766136d966c83c08980810488df706ceae01aa2921e16f8edc1d5" Oct 06 16:44:33 crc kubenswrapper[4763]: I1006 16:44:33.876706 4763 patch_prober.go:28] interesting 
pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:44:33 crc kubenswrapper[4763]: I1006 16:44:33.876784 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:44:33 crc kubenswrapper[4763]: I1006 16:44:33.937452 4763 scope.go:117] "RemoveContainer" containerID="d7e1e25ea32ccf8006fb4080e271239019e26451a6add1cccb17fd368b09182e" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.176422 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m"] Oct 06 16:45:00 crc kubenswrapper[4763]: E1006 16:45:00.177850 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="registry-server" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.177876 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="registry-server" Oct 06 16:45:00 crc kubenswrapper[4763]: E1006 16:45:00.177902 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="extract-content" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.177914 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="extract-content" Oct 06 16:45:00 crc kubenswrapper[4763]: E1006 16:45:00.177946 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="extract-utilities" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.177959 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="extract-utilities" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.178415 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="723c30ec-916d-4f88-9f67-c19af8d4b402" containerName="registry-server" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.181908 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.185078 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.185334 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m"] Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.185406 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.291621 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjgp\" (UniqueName: \"kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.292304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.292496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.394552 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjgp\" (UniqueName: \"kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.395023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.395135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.396396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume\") pod 
\"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.404027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.419632 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjgp\" (UniqueName: \"kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp\") pod \"collect-profiles-29329485-s8m7m\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.501867 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:00 crc kubenswrapper[4763]: I1006 16:45:00.980738 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m"] Oct 06 16:45:01 crc kubenswrapper[4763]: I1006 16:45:01.868827 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d3543f2-3b03-4162-9e84-f60efd4327ce" containerID="1773fa9e263a2eb86d181ba57e3db72b7416c94978ecfd098e5cf34e97afb51f" exitCode=0 Oct 06 16:45:01 crc kubenswrapper[4763]: I1006 16:45:01.869205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" event={"ID":"6d3543f2-3b03-4162-9e84-f60efd4327ce","Type":"ContainerDied","Data":"1773fa9e263a2eb86d181ba57e3db72b7416c94978ecfd098e5cf34e97afb51f"} Oct 06 16:45:01 crc kubenswrapper[4763]: I1006 16:45:01.869229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" event={"ID":"6d3543f2-3b03-4162-9e84-f60efd4327ce","Type":"ContainerStarted","Data":"48840c3f0ac8832ab8bda055e48343d0b478c9d58646583a6af44ddcfc6bb09b"} Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.280685 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.365120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume\") pod \"6d3543f2-3b03-4162-9e84-f60efd4327ce\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.365744 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume\") pod \"6d3543f2-3b03-4162-9e84-f60efd4327ce\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.366125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jjgp\" (UniqueName: \"kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp\") pod \"6d3543f2-3b03-4162-9e84-f60efd4327ce\" (UID: \"6d3543f2-3b03-4162-9e84-f60efd4327ce\") " Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.366462 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d3543f2-3b03-4162-9e84-f60efd4327ce" (UID: "6d3543f2-3b03-4162-9e84-f60efd4327ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.367580 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d3543f2-3b03-4162-9e84-f60efd4327ce-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.371439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d3543f2-3b03-4162-9e84-f60efd4327ce" (UID: "6d3543f2-3b03-4162-9e84-f60efd4327ce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.372461 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp" (OuterVolumeSpecName: "kube-api-access-8jjgp") pod "6d3543f2-3b03-4162-9e84-f60efd4327ce" (UID: "6d3543f2-3b03-4162-9e84-f60efd4327ce"). InnerVolumeSpecName "kube-api-access-8jjgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.469951 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jjgp\" (UniqueName: \"kubernetes.io/projected/6d3543f2-3b03-4162-9e84-f60efd4327ce-kube-api-access-8jjgp\") on node \"crc\" DevicePath \"\"" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.470197 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d3543f2-3b03-4162-9e84-f60efd4327ce-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.876972 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.877292 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.892553 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" event={"ID":"6d3543f2-3b03-4162-9e84-f60efd4327ce","Type":"ContainerDied","Data":"48840c3f0ac8832ab8bda055e48343d0b478c9d58646583a6af44ddcfc6bb09b"} Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.892591 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329485-s8m7m" Oct 06 16:45:03 crc kubenswrapper[4763]: I1006 16:45:03.892597 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48840c3f0ac8832ab8bda055e48343d0b478c9d58646583a6af44ddcfc6bb09b" Oct 06 16:45:04 crc kubenswrapper[4763]: I1006 16:45:04.394193 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck"] Oct 06 16:45:04 crc kubenswrapper[4763]: I1006 16:45:04.405084 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329440-dxjck"] Oct 06 16:45:05 crc kubenswrapper[4763]: I1006 16:45:05.594753 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf38ef16-1123-4994-aea6-fcf552779957" path="/var/lib/kubelet/pods/cf38ef16-1123-4994-aea6-fcf552779957/volumes" Oct 06 16:45:33 crc kubenswrapper[4763]: I1006 16:45:33.876635 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:45:33 crc kubenswrapper[4763]: I1006 16:45:33.877204 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:45:33 crc kubenswrapper[4763]: I1006 16:45:33.877252 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:45:33 crc kubenswrapper[4763]: I1006 16:45:33.878110 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:45:33 crc kubenswrapper[4763]: I1006 16:45:33.878163 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" gracePeriod=600 Oct 06 16:45:34 crc kubenswrapper[4763]: E1006 16:45:34.008018 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:45:34 crc kubenswrapper[4763]: I1006 16:45:34.072736 4763 scope.go:117] "RemoveContainer" containerID="40a757256c9198cd9bec8d3945ba649f98e41777aa43b0f6acbcac3a8e9455b8" Oct 06 16:45:34 crc kubenswrapper[4763]: I1006 16:45:34.305001 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" exitCode=0 Oct 06 16:45:34 crc kubenswrapper[4763]: I1006 16:45:34.305064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"} Oct 06 16:45:34 crc kubenswrapper[4763]: I1006 16:45:34.305124 4763 scope.go:117] "RemoveContainer" containerID="854ab145110d0f5a70c5d9b9abb6d45fd5cb2bb06709d04a1b016ba93727c81a" Oct 06 16:45:34 crc kubenswrapper[4763]: I1006 16:45:34.305876 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:45:34 crc kubenswrapper[4763]: E1006 16:45:34.306193 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:45:44 crc kubenswrapper[4763]: I1006 16:45:44.575647 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:45:44 crc kubenswrapper[4763]: E1006 16:45:44.576460 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:45:57 crc kubenswrapper[4763]: I1006 16:45:57.575159 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:45:57 crc kubenswrapper[4763]: E1006 16:45:57.575842 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:46:11 crc kubenswrapper[4763]: I1006 16:46:11.575569 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:46:11 crc kubenswrapper[4763]: E1006 16:46:11.576423 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:46:22 crc kubenswrapper[4763]: I1006 16:46:22.059706 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-xqxl8"] Oct 06 16:46:22 crc kubenswrapper[4763]: I1006 
16:46:22.075297 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-xqxl8"] Oct 06 16:46:23 crc kubenswrapper[4763]: I1006 16:46:23.591863 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb" path="/var/lib/kubelet/pods/c9e9dc26-fdf2-4ffe-9fec-c38fa0fb2dfb/volumes" Oct 06 16:46:25 crc kubenswrapper[4763]: I1006 16:46:25.575818 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:46:25 crc kubenswrapper[4763]: E1006 16:46:25.576992 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:46:32 crc kubenswrapper[4763]: I1006 16:46:32.030607 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-217d-account-create-9pkbx"] Oct 06 16:46:32 crc kubenswrapper[4763]: I1006 16:46:32.041726 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-217d-account-create-9pkbx"] Oct 06 16:46:33 crc kubenswrapper[4763]: I1006 16:46:33.587852 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f306383-e72b-4e5a-a010-f4685d581d95" path="/var/lib/kubelet/pods/9f306383-e72b-4e5a-a010-f4685d581d95/volumes" Oct 06 16:46:34 crc kubenswrapper[4763]: I1006 16:46:34.144646 4763 scope.go:117] "RemoveContainer" containerID="33cc4a63bf00c0a7aa0c06650fdb78047b399b4705ef94328708ffd5783bb0f3" Oct 06 16:46:34 crc kubenswrapper[4763]: I1006 16:46:34.179878 4763 scope.go:117] "RemoveContainer" containerID="2e00963f59a29db40a3659d85b5ad8336eea032bb8ff52b5603432223da99cb4" Oct 06 16:46:40 crc kubenswrapper[4763]: I1006 16:46:40.574687 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:46:40 crc kubenswrapper[4763]: E1006 16:46:40.575435 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:46:44 crc kubenswrapper[4763]: I1006 16:46:44.059040 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-jd6sn"] Oct 06 16:46:44 crc kubenswrapper[4763]: I1006 16:46:44.075470 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-jd6sn"] Oct 06 16:46:45 crc kubenswrapper[4763]: I1006 16:46:45.596741 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58816bc-70e9-4dfa-93ea-fc53e07fa77e" path="/var/lib/kubelet/pods/d58816bc-70e9-4dfa-93ea-fc53e07fa77e/volumes" Oct 06 16:46:54 crc kubenswrapper[4763]: I1006 16:46:54.575728 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:46:54 crc kubenswrapper[4763]: E1006 16:46:54.576644 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
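
The paired "RemoveContainer" / "Error syncing pod, skipping ... CrashLoopBackOff" entries that repeat through this stretch are back-off retries: the container has failed often enough that its restart delay has hit the ceiling, which is what "back-off 5m0s" means. A minimal sketch of that doubling delay, assuming the kubelet's usual 10s base and 5m cap (stated here as assumptions, not read from this log):

    // A sketch of a doubling restart back-off pinned at a 5m ceiling.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        base, maxDelay := 10*time.Second, 5*time.Minute
        delay := base
        for attempt := 1; attempt <= 7; attempt++ {
            fmt.Printf("restart attempt %d: wait %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay // stays at 5m0s, as in the messages above
            }
        }
    }
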
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:47:05 crc kubenswrapper[4763]: I1006 16:47:05.040162 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-xf4dm"] Oct 06 16:47:05 crc kubenswrapper[4763]: I1006 16:47:05.049394 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-xf4dm"] Oct 06 16:47:05 crc kubenswrapper[4763]: I1006 16:47:05.593232 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75a0351-94d8-4e4f-9ea0-74ccc0b74937" path="/var/lib/kubelet/pods/d75a0351-94d8-4e4f-9ea0-74ccc0b74937/volumes" Oct 06 16:47:06 crc kubenswrapper[4763]: I1006 16:47:06.575977 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:47:06 crc kubenswrapper[4763]: E1006 16:47:06.576490 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.793501 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"] Oct 06 16:47:12 crc kubenswrapper[4763]: E1006 16:47:12.796317 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3543f2-3b03-4162-9e84-f60efd4327ce" containerName="collect-profiles" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.796425 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3543f2-3b03-4162-9e84-f60efd4327ce" containerName="collect-profiles" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.796722 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3543f2-3b03-4162-9e84-f60efd4327ce" containerName="collect-profiles" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.798337 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.810240 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"] Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.949093 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.949247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlpkx\" (UniqueName: \"kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:12 crc kubenswrapper[4763]: I1006 16:47:12.949282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.051686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.051839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlpkx\" (UniqueName: \"kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.051880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.052391 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.052467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.076758 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nlpkx\" (UniqueName: \"kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx\") pod \"redhat-marketplace-fxggj\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") " pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.168992 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:13 crc kubenswrapper[4763]: I1006 16:47:13.677269 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"] Oct 06 16:47:14 crc kubenswrapper[4763]: I1006 16:47:14.441108 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerID="e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966" exitCode=0 Oct 06 16:47:14 crc kubenswrapper[4763]: I1006 16:47:14.441180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerDied","Data":"e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966"} Oct 06 16:47:14 crc kubenswrapper[4763]: I1006 16:47:14.441445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerStarted","Data":"4512fba2f2600ab95540575151f8baf42489df49b83362813177657e41d1812f"} Oct 06 16:47:14 crc kubenswrapper[4763]: I1006 16:47:14.444350 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:47:15 crc kubenswrapper[4763]: I1006 16:47:15.043016 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-6aa4-account-create-77nz4"] Oct 06 16:47:15 crc kubenswrapper[4763]: I1006 16:47:15.050217 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-6aa4-account-create-77nz4"] Oct 06 16:47:15 crc kubenswrapper[4763]: I1006 16:47:15.456589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerStarted","Data":"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4"} Oct 06 16:47:15 crc kubenswrapper[4763]: I1006 16:47:15.587218 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49af2895-6886-419f-84c0-e92cd4196c4f" path="/var/lib/kubelet/pods/49af2895-6886-419f-84c0-e92cd4196c4f/volumes" Oct 06 16:47:16 crc kubenswrapper[4763]: I1006 16:47:16.472859 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerID="75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4" exitCode=0 Oct 06 16:47:16 crc kubenswrapper[4763]: I1006 16:47:16.472975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerDied","Data":"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4"} Oct 06 16:47:17 crc kubenswrapper[4763]: I1006 16:47:17.484689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerStarted","Data":"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d"} Oct 06 16:47:17 crc 
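
The latency-tracker entry that closes this pull is self-consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window. Using the monotonic m=+ offsets in the entry, pulling took 6834.108778114 - 6831.599365274 = 2.509412840s, and 5.509506337 - 2.509412840 = 3.000093497, exactly the logged SLO duration. The arithmetic, reproduced:

    // Reproducing podStartSLOduration from the latency-tracker entry above,
    // using its monotonic m=+ offsets.
    package main

    import "fmt"

    func main() {
        const (
            firstStartedPulling = 6831.599365274 // m=+ offset, seconds
            lastFinishedPulling = 6834.108778114
            podStartE2E         = 5.509506337 // observedRunningTime - podCreationTimestamp
        )
        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image pull window:   %.9fs\n", pull)             // 2.509412840s
        fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pull) // 3.000093497s
    }
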
Oct 06 16:47:20 crc kubenswrapper[4763]: I1006 16:47:20.575495 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"
Oct 06 16:47:20 crc kubenswrapper[4763]: E1006 16:47:20.576462 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 16:47:23 crc kubenswrapper[4763]: I1006 16:47:23.169389 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxggj"
Oct 06 16:47:23 crc kubenswrapper[4763]: I1006 16:47:23.169820 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxggj"
Oct 06 16:47:23 crc kubenswrapper[4763]: I1006 16:47:23.227786 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxggj"
Oct 06 16:47:23 crc kubenswrapper[4763]: I1006 16:47:23.624632 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxggj"
Oct 06 16:47:23 crc kubenswrapper[4763]: I1006 16:47:23.693003 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"]
Oct 06 16:47:25 crc kubenswrapper[4763]: I1006 16:47:25.571642 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxggj" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="registry-server" containerID="cri-o://afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d" gracePeriod=2
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.088080 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxggj"
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.177717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlpkx\" (UniqueName: \"kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx\") pod \"8ddbef03-91ca-43b6-91f0-677bd66fedae\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") "
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.177866 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content\") pod \"8ddbef03-91ca-43b6-91f0-677bd66fedae\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") "
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.177926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities\") pod \"8ddbef03-91ca-43b6-91f0-677bd66fedae\" (UID: \"8ddbef03-91ca-43b6-91f0-677bd66fedae\") "
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.179323 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities" (OuterVolumeSpecName: "utilities") pod "8ddbef03-91ca-43b6-91f0-677bd66fedae" (UID: "8ddbef03-91ca-43b6-91f0-677bd66fedae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.184173 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx" (OuterVolumeSpecName: "kube-api-access-nlpkx") pod "8ddbef03-91ca-43b6-91f0-677bd66fedae" (UID: "8ddbef03-91ca-43b6-91f0-677bd66fedae"). InnerVolumeSpecName "kube-api-access-nlpkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.189587 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ddbef03-91ca-43b6-91f0-677bd66fedae" (UID: "8ddbef03-91ca-43b6-91f0-677bd66fedae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
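
The UnmountVolume / TearDown sequence above is the volume manager's reconciler at work: deleting the pod removes its volumes from the desired state, so the next pass unmounts whatever is still mounted and then reports each volume detached. A minimal sketch of that desired-versus-actual loop, with hypothetical types and names (World, reconcile):

    // A sketch of desired-vs-actual volume reconciliation; the printed
    // phrases mirror the log lines above, the types are hypothetical.
    package main

    import "fmt"

    // World maps volume names to mounted state for one pod.
    type World map[string]bool

    func reconcile(desired, actual World) {
        for vol := range actual {
            if !desired[vol] {
                fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
                delete(actual, vol)
                fmt.Printf("Volume detached for volume %q\n", vol)
            }
        }
        for vol := range desired {
            if !actual[vol] {
                fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
                actual[vol] = true
            }
        }
    }

    func main() {
        actual := World{"utilities": true, "catalog-content": true, "kube-api-access-nlpkx": true}
        reconcile(World{}, actual) // pod deleted: the desired world is empty
    }
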
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.280975 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlpkx\" (UniqueName: \"kubernetes.io/projected/8ddbef03-91ca-43b6-91f0-677bd66fedae-kube-api-access-nlpkx\") on node \"crc\" DevicePath \"\"" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.281005 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.281015 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ddbef03-91ca-43b6-91f0-677bd66fedae-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.593078 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerID="afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d" exitCode=0 Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.593113 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxggj" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.593135 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerDied","Data":"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d"} Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.593950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxggj" event={"ID":"8ddbef03-91ca-43b6-91f0-677bd66fedae","Type":"ContainerDied","Data":"4512fba2f2600ab95540575151f8baf42489df49b83362813177657e41d1812f"} Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.593985 4763 scope.go:117] "RemoveContainer" containerID="afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.651930 4763 scope.go:117] "RemoveContainer" containerID="75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.662052 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"] Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.684755 4763 scope.go:117] "RemoveContainer" containerID="e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.690763 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxggj"] Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.725541 4763 scope.go:117] "RemoveContainer" containerID="afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d" Oct 06 16:47:26 crc kubenswrapper[4763]: E1006 16:47:26.725991 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d\": container with ID starting with afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d not found: ID does not exist" containerID="afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.726024 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d"} err="failed to get container status \"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d\": rpc error: code = NotFound desc = could not find container \"afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d\": container with ID starting with afaf0c5e8e67f2ba243d953d57fa7a60c3710ca4586997250b3156682c3b2f0d not found: ID does not exist" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.726046 4763 scope.go:117] "RemoveContainer" containerID="75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4" Oct 06 16:47:26 crc kubenswrapper[4763]: E1006 16:47:26.726347 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4\": container with ID starting with 75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4 not found: ID does not exist" containerID="75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.726392 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4"} err="failed to get container status \"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4\": rpc error: code = NotFound desc = could not find container \"75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4\": container with ID starting with 75942a031a7bf9b427bc7f3ade1f3b94c4069935ae382f668bffc0a04bd51ab4 not found: ID does not exist" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.726424 4763 scope.go:117] "RemoveContainer" containerID="e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966" Oct 06 16:47:26 crc kubenswrapper[4763]: E1006 16:47:26.726679 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966\": container with ID starting with e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966 not found: ID does not exist" containerID="e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966" Oct 06 16:47:26 crc kubenswrapper[4763]: I1006 16:47:26.726706 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966"} err="failed to get container status \"e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966\": rpc error: code = NotFound desc = could not find container \"e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966\": container with ID starting with e111cd00977947ad55441dca15d4f089f8cfbf01fef9018af9aa70965a8e2966 not found: ID does not exist" Oct 06 16:47:27 crc kubenswrapper[4763]: I1006 16:47:27.594978 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" path="/var/lib/kubelet/pods/8ddbef03-91ca-43b6-91f0-677bd66fedae/volumes" Oct 06 16:47:28 crc kubenswrapper[4763]: I1006 16:47:28.043687 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-x976s"] Oct 06 16:47:28 crc kubenswrapper[4763]: I1006 16:47:28.050919 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
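
The three "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs above are harmless: the kubelet asks CRI-O to look up containers whose records were already pruned, gets NotFound back, and logs it without retrying, because the state it wants (container gone) already holds. A sketch of that NotFound-tolerant removal, with a sentinel error standing in for the gRPC NotFound status:

    // A sketch of idempotent container removal: NotFound means the work
    // is already done. errNotFound stands in for the CRI NotFound status.
    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("rpc error: code = NotFound desc = could not find container")

    func removeContainer(existing map[string]bool, id string) error {
        if !existing[id] {
            return errNotFound
        }
        delete(existing, id)
        return nil
    }

    func main() {
        containers := map[string]bool{} // already pruned by the runtime
        if err := removeContainer(containers, "afaf0c5e"); errors.Is(err, errNotFound) {
            fmt.Println("DeleteContainer returned error:", err, "(already gone, nothing to retry)")
        }
    }
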
pods=["openstack/manila-db-sync-x976s"] Oct 06 16:47:29 crc kubenswrapper[4763]: I1006 16:47:29.597835 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eccda7d-b9ac-464b-bf97-3281dca23b6c" path="/var/lib/kubelet/pods/6eccda7d-b9ac-464b-bf97-3281dca23b6c/volumes" Oct 06 16:47:34 crc kubenswrapper[4763]: I1006 16:47:34.309935 4763 scope.go:117] "RemoveContainer" containerID="c4accdecbc119e688ba398fb2853dcda6feca22aa6328c6351255af0ccd9211a" Oct 06 16:47:34 crc kubenswrapper[4763]: I1006 16:47:34.352741 4763 scope.go:117] "RemoveContainer" containerID="b9e96b884f2f1dcfc8a175f97aa0f8cc90a16ff704b2a8541031e75ef26b3f82" Oct 06 16:47:34 crc kubenswrapper[4763]: I1006 16:47:34.436510 4763 scope.go:117] "RemoveContainer" containerID="b51009a86e9f1c203b046d0b42717d6b63ba05e88ddd87225deb993660c3b70b" Oct 06 16:47:34 crc kubenswrapper[4763]: I1006 16:47:34.475723 4763 scope.go:117] "RemoveContainer" containerID="375aa3e0556c8ac0e6eae5c5d158c399b33f77cedbd7772f523e9f72a2ab123d" Oct 06 16:47:34 crc kubenswrapper[4763]: I1006 16:47:34.574729 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:47:34 crc kubenswrapper[4763]: E1006 16:47:34.575058 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:47:49 crc kubenswrapper[4763]: I1006 16:47:49.575785 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:47:49 crc kubenswrapper[4763]: E1006 16:47:49.576745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.790422 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"] Oct 06 16:47:53 crc kubenswrapper[4763]: E1006 16:47:53.791710 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="extract-utilities" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.791732 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="extract-utilities" Oct 06 16:47:53 crc kubenswrapper[4763]: E1006 16:47:53.791773 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="extract-content" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.791787 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="extract-content" Oct 06 16:47:53 crc kubenswrapper[4763]: E1006 16:47:53.791835 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="registry-server" Oct 06 16:47:53 crc 
kubenswrapper[4763]: I1006 16:47:53.791848 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="registry-server" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.792260 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddbef03-91ca-43b6-91f0-677bd66fedae" containerName="registry-server" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.795792 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.815739 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"] Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.861489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkdgz\" (UniqueName: \"kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.863941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.864169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.969866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.970210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkdgz\" (UniqueName: \"kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.970376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.970517 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: 
I1006 16:47:53.970870 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:53 crc kubenswrapper[4763]: I1006 16:47:53.991694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkdgz\" (UniqueName: \"kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz\") pod \"redhat-operators-9hw7g\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:54 crc kubenswrapper[4763]: I1006 16:47:54.125219 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:47:54 crc kubenswrapper[4763]: I1006 16:47:54.652080 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"] Oct 06 16:47:54 crc kubenswrapper[4763]: I1006 16:47:54.950247 4763 generic.go:334] "Generic (PLEG): container finished" podID="1299667d-97f7-4c2a-9d18-b48021550313" containerID="ad0dd28bb3fe3cb30a4582d883753c9090f6e8b9d102a263ac80840e508d0e39" exitCode=0 Oct 06 16:47:54 crc kubenswrapper[4763]: I1006 16:47:54.950358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerDied","Data":"ad0dd28bb3fe3cb30a4582d883753c9090f6e8b9d102a263ac80840e508d0e39"} Oct 06 16:47:54 crc kubenswrapper[4763]: I1006 16:47:54.950539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerStarted","Data":"c63eae3bf38761b1d3baefe64147089d8fb60cf0f8ee0c910e2058faa93499d2"} Oct 06 16:47:56 crc kubenswrapper[4763]: I1006 16:47:56.973594 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerStarted","Data":"7341ed2730f0922628ee259c66d5fd670c30ec8295ad19ae6e26b2afdd03679c"} Oct 06 16:48:00 crc kubenswrapper[4763]: I1006 16:48:00.010643 4763 generic.go:334] "Generic (PLEG): container finished" podID="1299667d-97f7-4c2a-9d18-b48021550313" containerID="7341ed2730f0922628ee259c66d5fd670c30ec8295ad19ae6e26b2afdd03679c" exitCode=0 Oct 06 16:48:00 crc kubenswrapper[4763]: I1006 16:48:00.010774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerDied","Data":"7341ed2730f0922628ee259c66d5fd670c30ec8295ad19ae6e26b2afdd03679c"} Oct 06 16:48:02 crc kubenswrapper[4763]: I1006 16:48:02.043608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerStarted","Data":"928b0db79c9254550f98117514cc6d448238f560e88a4828d47beb05aa738869"} Oct 06 16:48:02 crc kubenswrapper[4763]: I1006 16:48:02.072298 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9hw7g" podStartSLOduration=3.20482066 podStartE2EDuration="9.072279665s" podCreationTimestamp="2025-10-06 16:47:53 +0000 UTC" firstStartedPulling="2025-10-06 
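
The "Generic (PLEG)" lines threaded through this stretch come from the pod lifecycle event generator, which periodically relists containers and turns state changes between two snapshots into ContainerStarted and ContainerDied events for the sync loop. A minimal sketch of that diff, with hypothetical types:

    // A sketch of PLEG-style relisting: diff two container-state snapshots
    // into lifecycle events. Types and names are hypothetical.
    package main

    import "fmt"

    type state string

    const (
        running state = "running"
        exited  state = "exited"
    )

    func diff(prev, curr map[string]state) []string {
        var events []string
        for id, s := range curr {
            switch {
            case prev[id] != running && s == running:
                events = append(events, "ContainerStarted "+id)
            case prev[id] == running && s == exited:
                events = append(events, "ContainerDied "+id)
            }
        }
        return events
    }

    func main() {
        prev := map[string]state{"7341ed27": running}
        curr := map[string]state{"7341ed27": exited, "928b0db7": running}
        for _, e := range diff(prev, curr) {
            fmt.Println("SyncLoop (PLEG): event for pod:", e)
        }
    }
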
Oct 06 16:48:04 crc kubenswrapper[4763]: I1006 16:48:04.126360 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9hw7g"
Oct 06 16:48:04 crc kubenswrapper[4763]: I1006 16:48:04.126636 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9hw7g"
Oct 06 16:48:04 crc kubenswrapper[4763]: I1006 16:48:04.575198 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"
Oct 06 16:48:04 crc kubenswrapper[4763]: E1006 16:48:04.575849 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.181979 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9hw7g" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="registry-server" probeResult="failure" output=<
Oct 06 16:48:05 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Oct 06 16:48:05 crc kubenswrapper[4763]: >
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.695328 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"]
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.709018 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.709190 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"]
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.782559 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44xh\" (UniqueName: \"kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.782791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.782859 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.885215 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j44xh\" (UniqueName: \"kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.885303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.885333 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.885813 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.886320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:05 crc kubenswrapper[4763]: I1006 16:48:05.912085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44xh\" (UniqueName: \"kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh\") pod \"certified-operators-k6bbr\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") " pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:06 crc kubenswrapper[4763]: I1006 16:48:06.042760 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:06 crc kubenswrapper[4763]: I1006 16:48:06.649041 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"]
Oct 06 16:48:06 crc kubenswrapper[4763]: W1006 16:48:06.661941 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ce7741a_ee57_41fe_9c36_30562b802903.slice/crio-07d54a7792ed4f7d421f5dc3171cd30093c689f18d551437a3f140cea6d264a3 WatchSource:0}: Error finding container 07d54a7792ed4f7d421f5dc3171cd30093c689f18d551437a3f140cea6d264a3: Status 404 returned error can't find the container with id 07d54a7792ed4f7d421f5dc3171cd30093c689f18d551437a3f140cea6d264a3
Oct 06 16:48:07 crc kubenswrapper[4763]: I1006 16:48:07.120932 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ce7741a-ee57-41fe-9c36-30562b802903" containerID="80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d" exitCode=0
Oct 06 16:48:07 crc kubenswrapper[4763]: I1006 16:48:07.121258 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerDied","Data":"80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d"}
Oct 06 16:48:07 crc kubenswrapper[4763]: I1006 16:48:07.121303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerStarted","Data":"07d54a7792ed4f7d421f5dc3171cd30093c689f18d551437a3f140cea6d264a3"}
Oct 06 16:48:08 crc kubenswrapper[4763]: I1006 16:48:08.141363 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerStarted","Data":"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2"}
Oct 06 16:48:10 crc kubenswrapper[4763]: I1006 16:48:10.162660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerDied","Data":"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2"}
Oct 06 16:48:10 crc kubenswrapper[4763]: I1006 16:48:10.163300 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ce7741a-ee57-41fe-9c36-30562b802903" containerID="966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2" exitCode=0
Oct 06 16:48:11 crc kubenswrapper[4763]: I1006 16:48:11.179429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerStarted","Data":"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e"}
Oct 06 16:48:11 crc kubenswrapper[4763]: I1006 16:48:11.217356 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6bbr" podStartSLOduration=2.671956724 podStartE2EDuration="6.217319527s" podCreationTimestamp="2025-10-06 16:48:05 +0000 UTC" firstStartedPulling="2025-10-06 16:48:07.124588095 +0000 UTC m=+6884.279880647" lastFinishedPulling="2025-10-06 16:48:10.669950928 +0000 UTC m=+6887.825243450" observedRunningTime="2025-10-06 16:48:11.200800029 +0000 UTC m=+6888.356092581" watchObservedRunningTime="2025-10-06 16:48:11.217319527 +0000 UTC m=+6888.372612079"
Oct 06 16:48:14 crc kubenswrapper[4763]: I1006 16:48:14.211434 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9hw7g"
Oct 06 16:48:14 crc kubenswrapper[4763]: I1006 16:48:14.291169 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9hw7g"
Oct 06 16:48:16 crc kubenswrapper[4763]: I1006 16:48:16.043717 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:16 crc kubenswrapper[4763]: I1006 16:48:16.044176 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:16 crc kubenswrapper[4763]: I1006 16:48:16.115942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:16 crc kubenswrapper[4763]: I1006 16:48:16.308021 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:17 crc kubenswrapper[4763]: I1006 16:48:17.485261 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"]
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.259549 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6bbr" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="registry-server" containerID="cri-o://a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e" gracePeriod=2
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.578499 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22"
Oct 06 16:48:18 crc kubenswrapper[4763]: E1006 16:48:18.579454 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.862750 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6bbr"
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.892353 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"]
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.892710 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9hw7g" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="registry-server" containerID="cri-o://928b0db79c9254550f98117514cc6d448238f560e88a4828d47beb05aa738869" gracePeriod=2
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.896567 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j44xh\" (UniqueName: \"kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh\") pod \"3ce7741a-ee57-41fe-9c36-30562b802903\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") "
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.896737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content\") pod \"3ce7741a-ee57-41fe-9c36-30562b802903\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") "
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.896822 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities\") pod \"3ce7741a-ee57-41fe-9c36-30562b802903\" (UID: \"3ce7741a-ee57-41fe-9c36-30562b802903\") "
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.897874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities" (OuterVolumeSpecName: "utilities") pod "3ce7741a-ee57-41fe-9c36-30562b802903" (UID: "3ce7741a-ee57-41fe-9c36-30562b802903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.912040 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh" (OuterVolumeSpecName: "kube-api-access-j44xh") pod "3ce7741a-ee57-41fe-9c36-30562b802903" (UID: "3ce7741a-ee57-41fe-9c36-30562b802903"). InnerVolumeSpecName "kube-api-access-j44xh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 16:48:18 crc kubenswrapper[4763]: I1006 16:48:18.970497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ce7741a-ee57-41fe-9c36-30562b802903" (UID: "3ce7741a-ee57-41fe-9c36-30562b802903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
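
"Killing container with a grace period ... gracePeriod=2" above means the runtime sends SIGTERM first and escalates to SIGKILL only if the process is still alive after two seconds (the earlier machine-config-daemon kill used gracePeriod=600). A sketch of that two-step shutdown for a local process; process-group and reaping details are omitted:

    // A sketch of grace-period termination: SIGTERM, wait, then SIGKILL.
    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        cmd.Process.Signal(syscall.SIGTERM) // polite request first
        select {
        case <-done: // exited within the grace period
        case <-time.After(grace):
            cmd.Process.Kill() // SIGKILL once gracePeriod elapses
            <-done
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        killWithGrace(cmd, 2*time.Second)
        fmt.Println("container stopped, gracePeriod=2")
    }
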
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.000017 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.000065 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ce7741a-ee57-41fe-9c36-30562b802903-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.000083 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j44xh\" (UniqueName: \"kubernetes.io/projected/3ce7741a-ee57-41fe-9c36-30562b802903-kube-api-access-j44xh\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.272355 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ce7741a-ee57-41fe-9c36-30562b802903" containerID="a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e" exitCode=0 Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.272427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerDied","Data":"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e"} Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.272474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6bbr" event={"ID":"3ce7741a-ee57-41fe-9c36-30562b802903","Type":"ContainerDied","Data":"07d54a7792ed4f7d421f5dc3171cd30093c689f18d551437a3f140cea6d264a3"} Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.272492 4763 scope.go:117] "RemoveContainer" containerID="a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.272443 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6bbr" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.277062 4763 generic.go:334] "Generic (PLEG): container finished" podID="1299667d-97f7-4c2a-9d18-b48021550313" containerID="928b0db79c9254550f98117514cc6d448238f560e88a4828d47beb05aa738869" exitCode=0 Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.277100 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerDied","Data":"928b0db79c9254550f98117514cc6d448238f560e88a4828d47beb05aa738869"} Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.307287 4763 scope.go:117] "RemoveContainer" containerID="966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.309375 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"] Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.318540 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6bbr"] Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.330346 4763 scope.go:117] "RemoveContainer" containerID="80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.347708 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.355157 4763 scope.go:117] "RemoveContainer" containerID="a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e" Oct 06 16:48:19 crc kubenswrapper[4763]: E1006 16:48:19.359245 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e\": container with ID starting with a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e not found: ID does not exist" containerID="a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.359321 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e"} err="failed to get container status \"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e\": rpc error: code = NotFound desc = could not find container \"a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e\": container with ID starting with a40199aeb828f97651f6d9eb4babff057fcd4d996cc02f7a8e2ca47cee3d903e not found: ID does not exist" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.359362 4763 scope.go:117] "RemoveContainer" containerID="966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2" Oct 06 16:48:19 crc kubenswrapper[4763]: E1006 16:48:19.359760 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2\": container with ID starting with 966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2 not found: ID does not exist" containerID="966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.359792 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2"} err="failed to get container status \"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2\": rpc error: code = NotFound desc = could not find container \"966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2\": container with ID starting with 966a3dfb267ed4328f56b32d0afa2c152eebc9a3c4344a5db5ff8099fc8f26f2 not found: ID does not exist" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.359812 4763 scope.go:117] "RemoveContainer" containerID="80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d" Oct 06 16:48:19 crc kubenswrapper[4763]: E1006 16:48:19.360288 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d\": container with ID starting with 80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d not found: ID does not exist" containerID="80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.360315 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d"} err="failed to get container status \"80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d\": rpc error: code = 
NotFound desc = could not find container \"80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d\": container with ID starting with 80039ab27de924c0fc8fbd9812bba4f8ca6a342c34736e369140510a557c296d not found: ID does not exist" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.408726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkdgz\" (UniqueName: \"kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz\") pod \"1299667d-97f7-4c2a-9d18-b48021550313\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.408840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content\") pod \"1299667d-97f7-4c2a-9d18-b48021550313\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.408940 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities\") pod \"1299667d-97f7-4c2a-9d18-b48021550313\" (UID: \"1299667d-97f7-4c2a-9d18-b48021550313\") " Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.410284 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities" (OuterVolumeSpecName: "utilities") pod "1299667d-97f7-4c2a-9d18-b48021550313" (UID: "1299667d-97f7-4c2a-9d18-b48021550313"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.419444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz" (OuterVolumeSpecName: "kube-api-access-pkdgz") pod "1299667d-97f7-4c2a-9d18-b48021550313" (UID: "1299667d-97f7-4c2a-9d18-b48021550313"). InnerVolumeSpecName "kube-api-access-pkdgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.507002 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1299667d-97f7-4c2a-9d18-b48021550313" (UID: "1299667d-97f7-4c2a-9d18-b48021550313"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.511220 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkdgz\" (UniqueName: \"kubernetes.io/projected/1299667d-97f7-4c2a-9d18-b48021550313-kube-api-access-pkdgz\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.511255 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.511267 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1299667d-97f7-4c2a-9d18-b48021550313-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:48:19 crc kubenswrapper[4763]: I1006 16:48:19.590138 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" path="/var/lib/kubelet/pods/3ce7741a-ee57-41fe-9c36-30562b802903/volumes" Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.295748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hw7g" event={"ID":"1299667d-97f7-4c2a-9d18-b48021550313","Type":"ContainerDied","Data":"c63eae3bf38761b1d3baefe64147089d8fb60cf0f8ee0c910e2058faa93499d2"} Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.295780 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hw7g" Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.296087 4763 scope.go:117] "RemoveContainer" containerID="928b0db79c9254550f98117514cc6d448238f560e88a4828d47beb05aa738869" Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.340346 4763 scope.go:117] "RemoveContainer" containerID="7341ed2730f0922628ee259c66d5fd670c30ec8295ad19ae6e26b2afdd03679c" Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.350929 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"] Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.363639 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9hw7g"] Oct 06 16:48:20 crc kubenswrapper[4763]: I1006 16:48:20.382427 4763 scope.go:117] "RemoveContainer" containerID="ad0dd28bb3fe3cb30a4582d883753c9090f6e8b9d102a263ac80840e508d0e39" Oct 06 16:48:21 crc kubenswrapper[4763]: I1006 16:48:21.630562 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1299667d-97f7-4c2a-9d18-b48021550313" path="/var/lib/kubelet/pods/1299667d-97f7-4c2a-9d18-b48021550313/volumes" Oct 06 16:48:33 crc kubenswrapper[4763]: I1006 16:48:33.583715 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:48:33 crc kubenswrapper[4763]: E1006 16:48:33.584455 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:48:45 crc kubenswrapper[4763]: I1006 16:48:45.576150 4763 scope.go:117] "RemoveContainer" 
containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:48:45 crc kubenswrapper[4763]: E1006 16:48:45.577549 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:48:57 crc kubenswrapper[4763]: I1006 16:48:57.575855 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:48:57 crc kubenswrapper[4763]: E1006 16:48:57.577098 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:49:12 crc kubenswrapper[4763]: I1006 16:49:12.575304 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:49:12 crc kubenswrapper[4763]: E1006 16:49:12.576071 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:49:23 crc kubenswrapper[4763]: I1006 16:49:23.588312 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:49:23 crc kubenswrapper[4763]: E1006 16:49:23.590411 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:49:37 crc kubenswrapper[4763]: I1006 16:49:37.575578 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:49:37 crc kubenswrapper[4763]: E1006 16:49:37.576610 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:49:48 crc kubenswrapper[4763]: I1006 16:49:48.575279 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:49:48 crc kubenswrapper[4763]: E1006 16:49:48.575976 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:50:01 crc kubenswrapper[4763]: I1006 16:50:01.574931 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:50:01 crc kubenswrapper[4763]: E1006 16:50:01.575763 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:50:15 crc kubenswrapper[4763]: I1006 16:50:15.576073 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:50:15 crc kubenswrapper[4763]: E1006 16:50:15.577508 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:50:29 crc kubenswrapper[4763]: I1006 16:50:29.580111 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:50:29 crc kubenswrapper[4763]: E1006 16:50:29.583972 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:50:42 crc kubenswrapper[4763]: I1006 16:50:42.575847 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:50:42 crc kubenswrapper[4763]: I1006 16:50:42.911853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933"} Oct 06 16:53:03 crc kubenswrapper[4763]: I1006 16:53:03.877065 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:53:03 crc kubenswrapper[4763]: I1006 16:53:03.877711 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.986505 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987011 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="extract-utilities" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987026 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="extract-utilities" Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987043 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="extract-content" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987051 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="extract-content" Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="extract-utilities" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987080 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="extract-utilities" Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987109 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987127 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987134 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: E1006 16:53:03.987164 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="extract-content" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987173 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="extract-content" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987443 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1299667d-97f7-4c2a-9d18-b48021550313" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.987481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce7741a-ee57-41fe-9c36-30562b802903" containerName="registry-server" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.989809 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:03.991352 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.122394 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.122502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.122558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sd6l\" (UniqueName: \"kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.224596 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.224721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.224784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sd6l\" (UniqueName: \"kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.225428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.225498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.245701 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2sd6l\" (UniqueName: \"kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l\") pod \"community-operators-mxnnp\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.346335 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:04 crc kubenswrapper[4763]: I1006 16:53:04.918895 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:05 crc kubenswrapper[4763]: I1006 16:53:05.668422 4763 generic.go:334] "Generic (PLEG): container finished" podID="312eabf6-009b-48cc-9380-a33830fb674a" containerID="44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4" exitCode=0 Oct 06 16:53:05 crc kubenswrapper[4763]: I1006 16:53:05.668556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerDied","Data":"44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4"} Oct 06 16:53:05 crc kubenswrapper[4763]: I1006 16:53:05.668893 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerStarted","Data":"08094343c2e05e0b45952e7c75067308265fe5d7b21b9731b8c2b1c735114820"} Oct 06 16:53:05 crc kubenswrapper[4763]: I1006 16:53:05.673107 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:53:07 crc kubenswrapper[4763]: I1006 16:53:07.702973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerStarted","Data":"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba"} Oct 06 16:53:08 crc kubenswrapper[4763]: I1006 16:53:08.720642 4763 generic.go:334] "Generic (PLEG): container finished" podID="312eabf6-009b-48cc-9380-a33830fb674a" containerID="72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba" exitCode=0 Oct 06 16:53:08 crc kubenswrapper[4763]: I1006 16:53:08.720684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerDied","Data":"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba"} Oct 06 16:53:10 crc kubenswrapper[4763]: I1006 16:53:10.765021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerStarted","Data":"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9"} Oct 06 16:53:10 crc kubenswrapper[4763]: I1006 16:53:10.790925 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxnnp" podStartSLOduration=3.898638974 podStartE2EDuration="7.790893676s" podCreationTimestamp="2025-10-06 16:53:03 +0000 UTC" firstStartedPulling="2025-10-06 16:53:05.672770424 +0000 UTC m=+7182.828062946" lastFinishedPulling="2025-10-06 16:53:09.565025136 +0000 UTC m=+7186.720317648" observedRunningTime="2025-10-06 16:53:10.783893746 +0000 UTC m=+7187.939186288" watchObservedRunningTime="2025-10-06 
16:53:10.790893676 +0000 UTC m=+7187.946186198" Oct 06 16:53:14 crc kubenswrapper[4763]: I1006 16:53:14.347372 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:14 crc kubenswrapper[4763]: I1006 16:53:14.348187 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:14 crc kubenswrapper[4763]: I1006 16:53:14.403912 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:14 crc kubenswrapper[4763]: I1006 16:53:14.893809 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:14 crc kubenswrapper[4763]: I1006 16:53:14.963861 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:16 crc kubenswrapper[4763]: I1006 16:53:16.835527 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxnnp" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="registry-server" containerID="cri-o://9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9" gracePeriod=2 Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.390179 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.565076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities\") pod \"312eabf6-009b-48cc-9380-a33830fb674a\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.565840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sd6l\" (UniqueName: \"kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l\") pod \"312eabf6-009b-48cc-9380-a33830fb674a\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.566102 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities" (OuterVolumeSpecName: "utilities") pod "312eabf6-009b-48cc-9380-a33830fb674a" (UID: "312eabf6-009b-48cc-9380-a33830fb674a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.566128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content\") pod \"312eabf6-009b-48cc-9380-a33830fb674a\" (UID: \"312eabf6-009b-48cc-9380-a33830fb674a\") " Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.567974 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.576187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l" (OuterVolumeSpecName: "kube-api-access-2sd6l") pod "312eabf6-009b-48cc-9380-a33830fb674a" (UID: "312eabf6-009b-48cc-9380-a33830fb674a"). InnerVolumeSpecName "kube-api-access-2sd6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.671630 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sd6l\" (UniqueName: \"kubernetes.io/projected/312eabf6-009b-48cc-9380-a33830fb674a-kube-api-access-2sd6l\") on node \"crc\" DevicePath \"\"" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.681721 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "312eabf6-009b-48cc-9380-a33830fb674a" (UID: "312eabf6-009b-48cc-9380-a33830fb674a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.775567 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312eabf6-009b-48cc-9380-a33830fb674a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.846953 4763 generic.go:334] "Generic (PLEG): container finished" podID="312eabf6-009b-48cc-9380-a33830fb674a" containerID="9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9" exitCode=0 Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.846995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerDied","Data":"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9"} Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.847021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnnp" event={"ID":"312eabf6-009b-48cc-9380-a33830fb674a","Type":"ContainerDied","Data":"08094343c2e05e0b45952e7c75067308265fe5d7b21b9731b8c2b1c735114820"} Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.847027 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxnnp" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.847038 4763 scope.go:117] "RemoveContainer" containerID="9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.885315 4763 scope.go:117] "RemoveContainer" containerID="72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.889853 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.904200 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxnnp"] Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.917878 4763 scope.go:117] "RemoveContainer" containerID="44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.957174 4763 scope.go:117] "RemoveContainer" containerID="9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9" Oct 06 16:53:17 crc kubenswrapper[4763]: E1006 16:53:17.962165 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9\": container with ID starting with 9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9 not found: ID does not exist" containerID="9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.962201 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9"} err="failed to get container status \"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9\": rpc error: code = NotFound desc = could not find container \"9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9\": container with ID starting with 9a625cb714e3c720f1b280fcadc68e7856637028b919e0ee5c603b8db9d064f9 not found: ID does not exist" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.962225 4763 scope.go:117] "RemoveContainer" containerID="72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba" Oct 06 16:53:17 crc kubenswrapper[4763]: E1006 16:53:17.962565 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba\": container with ID starting with 72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba not found: ID does not exist" containerID="72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.962592 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba"} err="failed to get container status \"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba\": rpc error: code = NotFound desc = could not find container \"72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba\": container with ID starting with 72edba919ffa4d5653fb785fdc103d1ec058e5cc7b57aaf7422bdc3425064bba not found: ID does not exist" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.962626 4763 scope.go:117] "RemoveContainer" 
containerID="44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4" Oct 06 16:53:17 crc kubenswrapper[4763]: E1006 16:53:17.962882 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4\": container with ID starting with 44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4 not found: ID does not exist" containerID="44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4" Oct 06 16:53:17 crc kubenswrapper[4763]: I1006 16:53:17.962908 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4"} err="failed to get container status \"44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4\": rpc error: code = NotFound desc = could not find container \"44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4\": container with ID starting with 44ecfdcac81f96d8e0cf4d50515d633b76a28f6349acc84f930fa01669a72bc4 not found: ID does not exist" Oct 06 16:53:19 crc kubenswrapper[4763]: I1006 16:53:19.596061 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312eabf6-009b-48cc-9380-a33830fb674a" path="/var/lib/kubelet/pods/312eabf6-009b-48cc-9380-a33830fb674a/volumes" Oct 06 16:53:33 crc kubenswrapper[4763]: I1006 16:53:33.876837 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:53:33 crc kubenswrapper[4763]: I1006 16:53:33.879075 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:54:03 crc kubenswrapper[4763]: I1006 16:54:03.876327 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:54:03 crc kubenswrapper[4763]: I1006 16:54:03.877038 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:54:03 crc kubenswrapper[4763]: I1006 16:54:03.877104 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:54:03 crc kubenswrapper[4763]: I1006 16:54:03.878361 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:54:03 crc 
kubenswrapper[4763]: I1006 16:54:03.878491 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933" gracePeriod=600 Oct 06 16:54:04 crc kubenswrapper[4763]: I1006 16:54:04.440215 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933" exitCode=0 Oct 06 16:54:04 crc kubenswrapper[4763]: I1006 16:54:04.440295 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933"} Oct 06 16:54:04 crc kubenswrapper[4763]: I1006 16:54:04.440500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed"} Oct 06 16:54:04 crc kubenswrapper[4763]: I1006 16:54:04.440526 4763 scope.go:117] "RemoveContainer" containerID="ea885e8dcc3c5528b2153189bf0559c506912be8c34d9ea8155d28d175f9ec22" Oct 06 16:56:33 crc kubenswrapper[4763]: I1006 16:56:33.877204 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:56:33 crc kubenswrapper[4763]: I1006 16:56:33.878029 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:57:03 crc kubenswrapper[4763]: I1006 16:57:03.876888 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:57:03 crc kubenswrapper[4763]: I1006 16:57:03.877720 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:57:33 crc kubenswrapper[4763]: I1006 16:57:33.877223 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 16:57:33 crc kubenswrapper[4763]: I1006 16:57:33.877911 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 16:57:33 crc kubenswrapper[4763]: I1006 16:57:33.877978 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 16:57:33 crc kubenswrapper[4763]: I1006 16:57:33.878999 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 16:57:33 crc kubenswrapper[4763]: I1006 16:57:33.879118 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" gracePeriod=600 Oct 06 16:57:34 crc kubenswrapper[4763]: E1006 16:57:34.018280 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:57:34 crc kubenswrapper[4763]: I1006 16:57:34.860388 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" exitCode=0 Oct 06 16:57:34 crc kubenswrapper[4763]: I1006 16:57:34.860470 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed"} Oct 06 16:57:34 crc kubenswrapper[4763]: I1006 16:57:34.860914 4763 scope.go:117] "RemoveContainer" containerID="8bd0b4f0af8cef08440d9304344daf5a7435d9758e8cb5e9e86173ab21e73933" Oct 06 16:57:34 crc kubenswrapper[4763]: I1006 16:57:34.861843 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:57:34 crc kubenswrapper[4763]: E1006 16:57:34.862345 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:57:49 crc kubenswrapper[4763]: I1006 16:57:49.575763 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:57:49 crc kubenswrapper[4763]: E1006 16:57:49.577058 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:58:03 crc kubenswrapper[4763]: I1006 16:58:03.588814 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:58:03 crc kubenswrapper[4763]: E1006 16:58:03.589833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.526865 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:04 crc kubenswrapper[4763]: E1006 16:58:04.527882 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="extract-content" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.527912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="extract-content" Oct 06 16:58:04 crc kubenswrapper[4763]: E1006 16:58:04.527983 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="extract-utilities" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.527998 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="extract-utilities" Oct 06 16:58:04 crc kubenswrapper[4763]: E1006 16:58:04.528020 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="registry-server" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.528035 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="registry-server" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.528407 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="312eabf6-009b-48cc-9380-a33830fb674a" containerName="registry-server" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.534693 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.546727 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.606853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.607558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.607808 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9d9t\" (UniqueName: \"kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.709496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.709695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.709851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9d9t\" (UniqueName: \"kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.710216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.711040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.739801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n9d9t\" (UniqueName: \"kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t\") pod \"redhat-operators-vlq5d\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:04 crc kubenswrapper[4763]: I1006 16:58:04.862873 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:05 crc kubenswrapper[4763]: I1006 16:58:05.358824 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:06 crc kubenswrapper[4763]: I1006 16:58:06.242232 4763 generic.go:334] "Generic (PLEG): container finished" podID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerID="d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f" exitCode=0 Oct 06 16:58:06 crc kubenswrapper[4763]: I1006 16:58:06.242338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerDied","Data":"d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f"} Oct 06 16:58:06 crc kubenswrapper[4763]: I1006 16:58:06.242535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerStarted","Data":"712911e61e13e7c1a89b7501334943ad3abc4dcaa98f8714703c59f7fdc32d89"} Oct 06 16:58:06 crc kubenswrapper[4763]: I1006 16:58:06.246162 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 16:58:08 crc kubenswrapper[4763]: I1006 16:58:08.267431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerStarted","Data":"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43"} Oct 06 16:58:11 crc kubenswrapper[4763]: I1006 16:58:11.303569 4763 generic.go:334] "Generic (PLEG): container finished" podID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerID="218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43" exitCode=0 Oct 06 16:58:11 crc kubenswrapper[4763]: I1006 16:58:11.303701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerDied","Data":"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43"} Oct 06 16:58:12 crc kubenswrapper[4763]: I1006 16:58:12.313736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerStarted","Data":"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52"} Oct 06 16:58:12 crc kubenswrapper[4763]: I1006 16:58:12.338215 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vlq5d" podStartSLOduration=2.849950753 podStartE2EDuration="8.33819372s" podCreationTimestamp="2025-10-06 16:58:04 +0000 UTC" firstStartedPulling="2025-10-06 16:58:06.245881919 +0000 UTC m=+7483.401174441" lastFinishedPulling="2025-10-06 16:58:11.734124896 +0000 UTC m=+7488.889417408" observedRunningTime="2025-10-06 16:58:12.333902573 +0000 UTC m=+7489.489195125" watchObservedRunningTime="2025-10-06 16:58:12.33819372 +0000 UTC m=+7489.493486232" Oct 06 16:58:14 crc 
kubenswrapper[4763]: I1006 16:58:14.863592 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:14 crc kubenswrapper[4763]: I1006 16:58:14.865810 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:15 crc kubenswrapper[4763]: I1006 16:58:15.987927 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vlq5d" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="registry-server" probeResult="failure" output=< Oct 06 16:58:15 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 16:58:15 crc kubenswrapper[4763]: > Oct 06 16:58:18 crc kubenswrapper[4763]: I1006 16:58:18.574824 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:58:18 crc kubenswrapper[4763]: E1006 16:58:18.575725 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:58:24 crc kubenswrapper[4763]: I1006 16:58:24.943104 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:25 crc kubenswrapper[4763]: I1006 16:58:25.006776 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:25 crc kubenswrapper[4763]: I1006 16:58:25.190007 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:26 crc kubenswrapper[4763]: I1006 16:58:26.494843 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vlq5d" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="registry-server" containerID="cri-o://e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52" gracePeriod=2 Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.088343 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.116180 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content\") pod \"b173a0ee-9ce7-403b-aebe-b494dd947428\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.116411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities\") pod \"b173a0ee-9ce7-403b-aebe-b494dd947428\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.116450 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9d9t\" (UniqueName: \"kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t\") pod \"b173a0ee-9ce7-403b-aebe-b494dd947428\" (UID: \"b173a0ee-9ce7-403b-aebe-b494dd947428\") " Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.117189 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities" (OuterVolumeSpecName: "utilities") pod "b173a0ee-9ce7-403b-aebe-b494dd947428" (UID: "b173a0ee-9ce7-403b-aebe-b494dd947428"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.124077 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t" (OuterVolumeSpecName: "kube-api-access-n9d9t") pod "b173a0ee-9ce7-403b-aebe-b494dd947428" (UID: "b173a0ee-9ce7-403b-aebe-b494dd947428"). InnerVolumeSpecName "kube-api-access-n9d9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.219471 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.219532 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9d9t\" (UniqueName: \"kubernetes.io/projected/b173a0ee-9ce7-403b-aebe-b494dd947428-kube-api-access-n9d9t\") on node \"crc\" DevicePath \"\"" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.219664 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b173a0ee-9ce7-403b-aebe-b494dd947428" (UID: "b173a0ee-9ce7-403b-aebe-b494dd947428"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.321567 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b173a0ee-9ce7-403b-aebe-b494dd947428-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.509913 4763 generic.go:334] "Generic (PLEG): container finished" podID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerID="e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52" exitCode=0 Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.509959 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlq5d" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.509987 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerDied","Data":"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52"} Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.510038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlq5d" event={"ID":"b173a0ee-9ce7-403b-aebe-b494dd947428","Type":"ContainerDied","Data":"712911e61e13e7c1a89b7501334943ad3abc4dcaa98f8714703c59f7fdc32d89"} Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.510061 4763 scope.go:117] "RemoveContainer" containerID="e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.565989 4763 scope.go:117] "RemoveContainer" containerID="218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.594754 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.595201 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vlq5d"] Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.611380 4763 scope.go:117] "RemoveContainer" containerID="d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.656099 4763 scope.go:117] "RemoveContainer" containerID="e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52" Oct 06 16:58:27 crc kubenswrapper[4763]: E1006 16:58:27.656928 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52\": container with ID starting with e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52 not found: ID does not exist" containerID="e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.657095 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52"} err="failed to get container status \"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52\": rpc error: code = NotFound desc = could not find container \"e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52\": container with ID starting with e28fbe166cd88b053893a363aff58ff4f1d7e9efbcc826fe349b3067a5899c52 not found: ID does not exist" Oct 06 16:58:27 crc 
kubenswrapper[4763]: I1006 16:58:27.657134 4763 scope.go:117] "RemoveContainer" containerID="218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43" Oct 06 16:58:27 crc kubenswrapper[4763]: E1006 16:58:27.657501 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43\": container with ID starting with 218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43 not found: ID does not exist" containerID="218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.657530 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43"} err="failed to get container status \"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43\": rpc error: code = NotFound desc = could not find container \"218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43\": container with ID starting with 218cc22eeb318541a6a028113341d29f326aea476a1dfe58938032759e385b43 not found: ID does not exist" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.657548 4763 scope.go:117] "RemoveContainer" containerID="d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f" Oct 06 16:58:27 crc kubenswrapper[4763]: E1006 16:58:27.657999 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f\": container with ID starting with d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f not found: ID does not exist" containerID="d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f" Oct 06 16:58:27 crc kubenswrapper[4763]: I1006 16:58:27.658026 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f"} err="failed to get container status \"d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f\": rpc error: code = NotFound desc = could not find container \"d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f\": container with ID starting with d0882a968c270ec515073e4146f8ea4aabc3f596a59245a91242040054f5217f not found: ID does not exist" Oct 06 16:58:29 crc kubenswrapper[4763]: I1006 16:58:29.577186 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:58:29 crc kubenswrapper[4763]: E1006 16:58:29.577893 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:58:29 crc kubenswrapper[4763]: I1006 16:58:29.591537 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" path="/var/lib/kubelet/pods/b173a0ee-9ce7-403b-aebe-b494dd947428/volumes" Oct 06 16:58:44 crc kubenswrapper[4763]: I1006 16:58:44.576222 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" 
Oct 06 16:58:44 crc kubenswrapper[4763]: E1006 16:58:44.577429 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:58:56 crc kubenswrapper[4763]: I1006 16:58:56.575458 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:58:56 crc kubenswrapper[4763]: E1006 16:58:56.576859 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:07 crc kubenswrapper[4763]: I1006 16:59:07.574954 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:59:07 crc kubenswrapper[4763]: E1006 16:59:07.577176 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:20 crc kubenswrapper[4763]: I1006 16:59:20.575258 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:59:20 crc kubenswrapper[4763]: E1006 16:59:20.576248 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.939568 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:24 crc kubenswrapper[4763]: E1006 16:59:24.940755 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="extract-content" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.940777 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="extract-content" Oct 06 16:59:24 crc kubenswrapper[4763]: E1006 16:59:24.940806 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="registry-server" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.940815 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="registry-server" Oct 06 16:59:24 crc kubenswrapper[4763]: E1006 16:59:24.940842 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="extract-utilities" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.940852 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="extract-utilities" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.941159 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b173a0ee-9ce7-403b-aebe-b494dd947428" containerName="registry-server" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.943344 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:24 crc kubenswrapper[4763]: I1006 16:59:24.952867 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.109825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.109933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w95\" (UniqueName: \"kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.110203 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.212495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.212947 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w95\" (UniqueName: \"kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.213119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.213165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.213487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.240438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w95\" (UniqueName: \"kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95\") pod \"certified-operators-c8cll\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.298827 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:25 crc kubenswrapper[4763]: I1006 16:59:25.829427 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:26 crc kubenswrapper[4763]: I1006 16:59:26.246355 4763 generic.go:334] "Generic (PLEG): container finished" podID="c57ce4be-e7c1-47cb-b294-0835da769452" containerID="aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7" exitCode=0 Oct 06 16:59:26 crc kubenswrapper[4763]: I1006 16:59:26.246480 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerDied","Data":"aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7"} Oct 06 16:59:26 crc kubenswrapper[4763]: I1006 16:59:26.246890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerStarted","Data":"1b7e61edd11f8e954cda94b1d0eb3365d69c1c889bccd0509020338ec003d5f8"} Oct 06 16:59:28 crc kubenswrapper[4763]: I1006 16:59:28.269547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerStarted","Data":"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc"} Oct 06 16:59:29 crc kubenswrapper[4763]: I1006 16:59:29.289382 4763 generic.go:334] "Generic (PLEG): container finished" podID="c57ce4be-e7c1-47cb-b294-0835da769452" containerID="65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc" exitCode=0 Oct 06 16:59:29 crc kubenswrapper[4763]: I1006 16:59:29.289484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerDied","Data":"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc"} Oct 06 16:59:30 crc kubenswrapper[4763]: I1006 16:59:30.317922 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerStarted","Data":"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28"} Oct 06 16:59:30 crc kubenswrapper[4763]: I1006 
16:59:30.347555 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8cll" podStartSLOduration=2.79451917 podStartE2EDuration="6.347530718s" podCreationTimestamp="2025-10-06 16:59:24 +0000 UTC" firstStartedPulling="2025-10-06 16:59:26.249190586 +0000 UTC m=+7563.404483118" lastFinishedPulling="2025-10-06 16:59:29.802202154 +0000 UTC m=+7566.957494666" observedRunningTime="2025-10-06 16:59:30.340132287 +0000 UTC m=+7567.495424809" watchObservedRunningTime="2025-10-06 16:59:30.347530718 +0000 UTC m=+7567.502823230" Oct 06 16:59:31 crc kubenswrapper[4763]: I1006 16:59:31.575571 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:59:31 crc kubenswrapper[4763]: E1006 16:59:31.576390 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:35 crc kubenswrapper[4763]: I1006 16:59:35.299149 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:35 crc kubenswrapper[4763]: I1006 16:59:35.299883 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:35 crc kubenswrapper[4763]: I1006 16:59:35.359457 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:35 crc kubenswrapper[4763]: I1006 16:59:35.436448 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:35 crc kubenswrapper[4763]: I1006 16:59:35.607801 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:37 crc kubenswrapper[4763]: I1006 16:59:37.400662 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8cll" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="registry-server" containerID="cri-o://30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28" gracePeriod=2 Oct 06 16:59:37 crc kubenswrapper[4763]: I1006 16:59:37.990275 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.140431 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities\") pod \"c57ce4be-e7c1-47cb-b294-0835da769452\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.140536 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6w95\" (UniqueName: \"kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95\") pod \"c57ce4be-e7c1-47cb-b294-0835da769452\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.140869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content\") pod \"c57ce4be-e7c1-47cb-b294-0835da769452\" (UID: \"c57ce4be-e7c1-47cb-b294-0835da769452\") " Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.142134 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities" (OuterVolumeSpecName: "utilities") pod "c57ce4be-e7c1-47cb-b294-0835da769452" (UID: "c57ce4be-e7c1-47cb-b294-0835da769452"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.149780 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95" (OuterVolumeSpecName: "kube-api-access-b6w95") pod "c57ce4be-e7c1-47cb-b294-0835da769452" (UID: "c57ce4be-e7c1-47cb-b294-0835da769452"). InnerVolumeSpecName "kube-api-access-b6w95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.183491 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c57ce4be-e7c1-47cb-b294-0835da769452" (UID: "c57ce4be-e7c1-47cb-b294-0835da769452"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.243350 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.243396 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6w95\" (UniqueName: \"kubernetes.io/projected/c57ce4be-e7c1-47cb-b294-0835da769452-kube-api-access-b6w95\") on node \"crc\" DevicePath \"\"" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.243408 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c57ce4be-e7c1-47cb-b294-0835da769452-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.414359 4763 generic.go:334] "Generic (PLEG): container finished" podID="c57ce4be-e7c1-47cb-b294-0835da769452" containerID="30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28" exitCode=0 Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.414398 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerDied","Data":"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28"} Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.414423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8cll" event={"ID":"c57ce4be-e7c1-47cb-b294-0835da769452","Type":"ContainerDied","Data":"1b7e61edd11f8e954cda94b1d0eb3365d69c1c889bccd0509020338ec003d5f8"} Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.414440 4763 scope.go:117] "RemoveContainer" containerID="30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.414503 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8cll" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.456082 4763 scope.go:117] "RemoveContainer" containerID="65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.472559 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.480933 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8cll"] Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.488738 4763 scope.go:117] "RemoveContainer" containerID="aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.556335 4763 scope.go:117] "RemoveContainer" containerID="30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28" Oct 06 16:59:38 crc kubenswrapper[4763]: E1006 16:59:38.556693 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28\": container with ID starting with 30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28 not found: ID does not exist" containerID="30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.556731 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28"} err="failed to get container status \"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28\": rpc error: code = NotFound desc = could not find container \"30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28\": container with ID starting with 30143b5ef7c9238774c45edac6c3bbd5eaf5f2c7f5845b5df9587ddd448bce28 not found: ID does not exist" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.556759 4763 scope.go:117] "RemoveContainer" containerID="65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc" Oct 06 16:59:38 crc kubenswrapper[4763]: E1006 16:59:38.556969 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc\": container with ID starting with 65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc not found: ID does not exist" containerID="65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.557000 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc"} err="failed to get container status \"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc\": rpc error: code = NotFound desc = could not find container \"65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc\": container with ID starting with 65f0ab977d0af7edf87595a991f550a95b4fa06996d991ae3a2dfec62a214fdc not found: ID does not exist" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.557017 4763 scope.go:117] "RemoveContainer" containerID="aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7" Oct 06 16:59:38 crc kubenswrapper[4763]: E1006 16:59:38.557468 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7\": container with ID starting with aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7 not found: ID does not exist" containerID="aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7" Oct 06 16:59:38 crc kubenswrapper[4763]: I1006 16:59:38.557496 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7"} err="failed to get container status \"aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7\": rpc error: code = NotFound desc = could not find container \"aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7\": container with ID starting with aae138104d268d69c6ce8e1de79a384ee41adb576ddfaf9498ed5a64dad321c7 not found: ID does not exist" Oct 06 16:59:39 crc kubenswrapper[4763]: I1006 16:59:39.590522 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" path="/var/lib/kubelet/pods/c57ce4be-e7c1-47cb-b294-0835da769452/volumes" Oct 06 16:59:45 crc kubenswrapper[4763]: I1006 16:59:45.575607 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:59:45 crc kubenswrapper[4763]: E1006 16:59:45.576438 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.416068 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 16:59:53 crc kubenswrapper[4763]: E1006 16:59:53.427769 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="extract-utilities" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.427806 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="extract-utilities" Oct 06 16:59:53 crc kubenswrapper[4763]: E1006 16:59:53.427836 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="registry-server" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.427846 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="registry-server" Oct 06 16:59:53 crc kubenswrapper[4763]: E1006 16:59:53.427937 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="extract-content" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.427947 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="extract-content" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.428735 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ce4be-e7c1-47cb-b294-0835da769452" containerName="registry-server" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.432964 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.450133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.514840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.515193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.515275 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7n92\" (UniqueName: \"kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.618329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.618429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7n92\" (UniqueName: \"kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.618711 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.619005 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.619735 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.647828 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r7n92\" (UniqueName: \"kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92\") pod \"redhat-marketplace-mq4lj\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:53 crc kubenswrapper[4763]: I1006 16:59:53.773031 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 16:59:54 crc kubenswrapper[4763]: I1006 16:59:54.368493 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 16:59:54 crc kubenswrapper[4763]: I1006 16:59:54.605585 4763 generic.go:334] "Generic (PLEG): container finished" podID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerID="15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28" exitCode=0 Oct 06 16:59:54 crc kubenswrapper[4763]: I1006 16:59:54.605946 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerDied","Data":"15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28"} Oct 06 16:59:54 crc kubenswrapper[4763]: I1006 16:59:54.605976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerStarted","Data":"884169bc152601e59caf4ff1240780e707d49a5e3d191fe18b4e96d32b240186"} Oct 06 16:59:55 crc kubenswrapper[4763]: I1006 16:59:55.623586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerStarted","Data":"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1"} Oct 06 16:59:56 crc kubenswrapper[4763]: I1006 16:59:56.637807 4763 generic.go:334] "Generic (PLEG): container finished" podID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerID="f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1" exitCode=0 Oct 06 16:59:56 crc kubenswrapper[4763]: I1006 16:59:56.637910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerDied","Data":"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1"} Oct 06 16:59:57 crc kubenswrapper[4763]: I1006 16:59:57.575475 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 16:59:57 crc kubenswrapper[4763]: E1006 16:59:57.576155 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 16:59:57 crc kubenswrapper[4763]: I1006 16:59:57.660254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerStarted","Data":"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464"} Oct 06 16:59:57 crc kubenswrapper[4763]: I1006 16:59:57.711079 4763 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-mq4lj" podStartSLOduration=2.162750193 podStartE2EDuration="4.711059611s" podCreationTimestamp="2025-10-06 16:59:53 +0000 UTC" firstStartedPulling="2025-10-06 16:59:54.608022313 +0000 UTC m=+7591.763314825" lastFinishedPulling="2025-10-06 16:59:57.156331691 +0000 UTC m=+7594.311624243" observedRunningTime="2025-10-06 16:59:57.691703234 +0000 UTC m=+7594.846995746" watchObservedRunningTime="2025-10-06 16:59:57.711059611 +0000 UTC m=+7594.866352143" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.177872 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9"] Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.181816 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.184721 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.184774 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.189930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9"] Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.274714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpd7\" (UniqueName: \"kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.275031 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.275125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.376258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.376308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume\") pod 
\"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.376413 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpd7\" (UniqueName: \"kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.377316 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.391136 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.408997 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpd7\" (UniqueName: \"kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7\") pod \"collect-profiles-29329500-jl9m9\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.531999 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:00 crc kubenswrapper[4763]: I1006 17:00:00.975338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9"] Oct 06 17:00:00 crc kubenswrapper[4763]: W1006 17:00:00.980139 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81379a99_f3e1_4b23_9022_890539bf5f42.slice/crio-32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5 WatchSource:0}: Error finding container 32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5: Status 404 returned error can't find the container with id 32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5 Oct 06 17:00:01 crc kubenswrapper[4763]: I1006 17:00:01.708034 4763 generic.go:334] "Generic (PLEG): container finished" podID="81379a99-f3e1-4b23-9022-890539bf5f42" containerID="13fa56dc7fda6733053f11b5bb4a86fe55117cd3b23161a1dee8a1e77b5f531c" exitCode=0 Oct 06 17:00:01 crc kubenswrapper[4763]: I1006 17:00:01.708304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" event={"ID":"81379a99-f3e1-4b23-9022-890539bf5f42","Type":"ContainerDied","Data":"13fa56dc7fda6733053f11b5bb4a86fe55117cd3b23161a1dee8a1e77b5f531c"} Oct 06 17:00:01 crc kubenswrapper[4763]: I1006 17:00:01.709289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" event={"ID":"81379a99-f3e1-4b23-9022-890539bf5f42","Type":"ContainerStarted","Data":"32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5"} Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.164867 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.253372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume\") pod \"81379a99-f3e1-4b23-9022-890539bf5f42\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.253663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnpd7\" (UniqueName: \"kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7\") pod \"81379a99-f3e1-4b23-9022-890539bf5f42\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.253942 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume\") pod \"81379a99-f3e1-4b23-9022-890539bf5f42\" (UID: \"81379a99-f3e1-4b23-9022-890539bf5f42\") " Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.254841 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume" (OuterVolumeSpecName: "config-volume") pod "81379a99-f3e1-4b23-9022-890539bf5f42" (UID: "81379a99-f3e1-4b23-9022-890539bf5f42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.255885 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81379a99-f3e1-4b23-9022-890539bf5f42-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.260488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81379a99-f3e1-4b23-9022-890539bf5f42" (UID: "81379a99-f3e1-4b23-9022-890539bf5f42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.262332 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7" (OuterVolumeSpecName: "kube-api-access-pnpd7") pod "81379a99-f3e1-4b23-9022-890539bf5f42" (UID: "81379a99-f3e1-4b23-9022-890539bf5f42"). InnerVolumeSpecName "kube-api-access-pnpd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.358072 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnpd7\" (UniqueName: \"kubernetes.io/projected/81379a99-f3e1-4b23-9022-890539bf5f42-kube-api-access-pnpd7\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.358127 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81379a99-f3e1-4b23-9022-890539bf5f42-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.731918 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" event={"ID":"81379a99-f3e1-4b23-9022-890539bf5f42","Type":"ContainerDied","Data":"32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5"} Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.732323 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e49c450f124682e1f532302f95fb087520c80aba15bffd2fcd1e3b88fc68d5" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.732422 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329500-jl9m9" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.773350 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.773393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:03 crc kubenswrapper[4763]: I1006 17:00:03.845261 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:04 crc kubenswrapper[4763]: I1006 17:00:04.275585 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj"] Oct 06 17:00:04 crc kubenswrapper[4763]: I1006 17:00:04.290356 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329455-5jwhj"] Oct 06 17:00:04 crc kubenswrapper[4763]: I1006 17:00:04.805870 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:04 crc kubenswrapper[4763]: I1006 17:00:04.855118 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 17:00:05 crc kubenswrapper[4763]: I1006 17:00:05.598429 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a955b364-4571-4269-80f2-66838a8d8303" path="/var/lib/kubelet/pods/a955b364-4571-4269-80f2-66838a8d8303/volumes" Oct 06 17:00:06 crc kubenswrapper[4763]: I1006 17:00:06.769501 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mq4lj" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="registry-server" containerID="cri-o://3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464" gracePeriod=2 Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.360232 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.455518 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7n92\" (UniqueName: \"kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92\") pod \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.455653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content\") pod \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.455879 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities\") pod \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\" (UID: \"08a7c19d-6fd9-491c-a620-6063b89a0dd1\") " Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.456850 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities" (OuterVolumeSpecName: "utilities") pod "08a7c19d-6fd9-491c-a620-6063b89a0dd1" (UID: "08a7c19d-6fd9-491c-a620-6063b89a0dd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.461387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92" (OuterVolumeSpecName: "kube-api-access-r7n92") pod "08a7c19d-6fd9-491c-a620-6063b89a0dd1" (UID: "08a7c19d-6fd9-491c-a620-6063b89a0dd1"). InnerVolumeSpecName "kube-api-access-r7n92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.467923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08a7c19d-6fd9-491c-a620-6063b89a0dd1" (UID: "08a7c19d-6fd9-491c-a620-6063b89a0dd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.558473 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.558766 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7n92\" (UniqueName: \"kubernetes.io/projected/08a7c19d-6fd9-491c-a620-6063b89a0dd1-kube-api-access-r7n92\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.558833 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08a7c19d-6fd9-491c-a620-6063b89a0dd1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.786129 4763 generic.go:334] "Generic (PLEG): container finished" podID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerID="3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464" exitCode=0 Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.786179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerDied","Data":"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464"} Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.786194 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq4lj" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.786211 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq4lj" event={"ID":"08a7c19d-6fd9-491c-a620-6063b89a0dd1","Type":"ContainerDied","Data":"884169bc152601e59caf4ff1240780e707d49a5e3d191fe18b4e96d32b240186"} Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.786233 4763 scope.go:117] "RemoveContainer" containerID="3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.816458 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.823881 4763 scope.go:117] "RemoveContainer" containerID="f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.827076 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq4lj"] Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.849277 4763 scope.go:117] "RemoveContainer" containerID="15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.904389 4763 scope.go:117] "RemoveContainer" containerID="3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464" Oct 06 17:00:07 crc kubenswrapper[4763]: E1006 17:00:07.904993 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464\": container with ID starting with 3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464 not found: ID does not exist" containerID="3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.905078 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464"} err="failed to get container status \"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464\": rpc error: code = NotFound desc = could not find container \"3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464\": container with ID starting with 3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464 not found: ID does not exist" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.905111 4763 scope.go:117] "RemoveContainer" containerID="f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1" Oct 06 17:00:07 crc kubenswrapper[4763]: E1006 17:00:07.905481 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1\": container with ID starting with f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1 not found: ID does not exist" containerID="f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.905519 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1"} err="failed to get container status \"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1\": rpc error: code = NotFound desc = could not find container \"f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1\": container with ID starting with f3d7632fd5ebab3172b4c6a227cfbcab1ba8b2a07d21f53a37cca2683462acb1 not found: ID does not exist" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.905545 4763 scope.go:117] "RemoveContainer" containerID="15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28" Oct 06 17:00:07 crc kubenswrapper[4763]: E1006 17:00:07.905911 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28\": container with ID starting with 15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28 not found: ID does not exist" containerID="15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28" Oct 06 17:00:07 crc kubenswrapper[4763]: I1006 17:00:07.905942 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28"} err="failed to get container status \"15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28\": rpc error: code = NotFound desc = could not find container \"15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28\": container with ID starting with 15968a8c8730998adc4892d3060f1494ebaeced595d271389e8c87183b3a4a28 not found: ID does not exist" Oct 06 17:00:09 crc kubenswrapper[4763]: I1006 17:00:09.595276 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" path="/var/lib/kubelet/pods/08a7c19d-6fd9-491c-a620-6063b89a0dd1/volumes" Oct 06 17:00:11 crc kubenswrapper[4763]: I1006 17:00:11.576877 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:00:11 crc kubenswrapper[4763]: E1006 17:00:11.577860 4763 pod_workers.go:1301] "Error syncing pod, skipping" 
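err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"

The RemoveContainer/NotFound exchanges above (17:00:07.904 through 17:00:07.905) are the kubelet re-deleting containers that CRI-O has already garbage-collected; the runtime's NotFound answer is treated as "already gone" and only logged. A minimal Go sketch of that idempotent-delete pattern, using the grpc-go status package and an error string copied from the log; the removeContainer stub is hypothetical, not kubelet or CRI-O source:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call; this stub always
// fails the way CRI-O answered in the log above (hypothetical, illustration only).
func removeContainer(id string) error {
	return status.Error(codes.NotFound,
		fmt.Sprintf("could not find container %q: container with ID starting with %s not found: ID does not exist", id, id))
}

func main() {
	id := "3b7068d8345098cc40af100dcdc7af7856bdfc19e1a85914673ff1b1e81c2464"
	if err := removeContainer(id); err != nil {
		// Treat NotFound as success: the container is already gone, so the
		// delete is idempotent and the sync loop can simply move on.
		if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
			fmt.Println("container already removed:", id)
			return
		}
		fmt.Println("delete failed:", err)
	}
}
```

Swallowing NotFound this way is what keeps the repeated DeleteContainer attempts above harmless: the later attempts find nothing to do and the error is merely recorded.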
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:00:23 crc kubenswrapper[4763]: I1006 17:00:23.581903 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:00:23 crc kubenswrapper[4763]: E1006 17:00:23.583969 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:00:35 crc kubenswrapper[4763]: I1006 17:00:35.025551 4763 scope.go:117] "RemoveContainer" containerID="250e16f53888ff5c48d86b6b9b48112889c2575631ca3778245e6472b34ce0ee" Oct 06 17:00:35 crc kubenswrapper[4763]: I1006 17:00:35.574638 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:00:35 crc kubenswrapper[4763]: E1006 17:00:35.575369 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:00:49 crc kubenswrapper[4763]: I1006 17:00:49.575994 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:00:49 crc kubenswrapper[4763]: E1006 17:00:49.576799 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.184227 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329501-pqptf"] Oct 06 17:01:00 crc kubenswrapper[4763]: E1006 17:01:00.185434 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81379a99-f3e1-4b23-9022-890539bf5f42" containerName="collect-profiles" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.185457 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="81379a99-f3e1-4b23-9022-890539bf5f42" containerName="collect-profiles" Oct 06 17:01:00 crc kubenswrapper[4763]: E1006 17:01:00.185509 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="extract-utilities" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.185521 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="extract-utilities" Oct 06 17:01:00 crc 
kubenswrapper[4763]: E1006 17:01:00.185545 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="registry-server" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.185559 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="registry-server" Oct 06 17:01:00 crc kubenswrapper[4763]: E1006 17:01:00.185601 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="extract-content" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.185644 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="extract-content" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.186000 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a7c19d-6fd9-491c-a620-6063b89a0dd1" containerName="registry-server" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.186025 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="81379a99-f3e1-4b23-9022-890539bf5f42" containerName="collect-profiles" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.187226 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.208493 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329501-pqptf"] Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.308437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.308505 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.308915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.308994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlkv\" (UniqueName: \"kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.411875 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 
17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.411951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.412124 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlkv\" (UniqueName: \"kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.412157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.422932 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.423736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.424686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.450102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlkv\" (UniqueName: \"kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv\") pod \"keystone-cron-29329501-pqptf\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:00 crc kubenswrapper[4763]: I1006 17:01:00.531473 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:01 crc kubenswrapper[4763]: I1006 17:01:01.094543 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329501-pqptf"] Oct 06 17:01:01 crc kubenswrapper[4763]: I1006 17:01:01.479610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329501-pqptf" event={"ID":"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6","Type":"ContainerStarted","Data":"7661cb5f82e9d55e0c99b1df5ab356c250b2776b22ededbf30c5e53833e7aa28"} Oct 06 17:01:01 crc kubenswrapper[4763]: I1006 17:01:01.479925 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329501-pqptf" event={"ID":"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6","Type":"ContainerStarted","Data":"d7b22c3dc5d060df0501a5243618c9b1399ffc7ccf3a0693c67589e4760a8725"} Oct 06 17:01:01 crc kubenswrapper[4763]: I1006 17:01:01.506481 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329501-pqptf" podStartSLOduration=1.5064538490000001 podStartE2EDuration="1.506453849s" podCreationTimestamp="2025-10-06 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 17:01:01.500313922 +0000 UTC m=+7658.655606434" watchObservedRunningTime="2025-10-06 17:01:01.506453849 +0000 UTC m=+7658.661746361" Oct 06 17:01:01 crc kubenswrapper[4763]: I1006 17:01:01.575368 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:01:01 crc kubenswrapper[4763]: E1006 17:01:01.575872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:01:04 crc kubenswrapper[4763]: I1006 17:01:04.517707 4763 generic.go:334] "Generic (PLEG): container finished" podID="f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" containerID="7661cb5f82e9d55e0c99b1df5ab356c250b2776b22ededbf30c5e53833e7aa28" exitCode=0 Oct 06 17:01:04 crc kubenswrapper[4763]: I1006 17:01:04.517839 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329501-pqptf" event={"ID":"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6","Type":"ContainerDied","Data":"7661cb5f82e9d55e0c99b1df5ab356c250b2776b22ededbf30c5e53833e7aa28"} Oct 06 17:01:05 crc kubenswrapper[4763]: I1006 17:01:05.990278 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.057212 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data\") pod \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.057253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle\") pod \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.057315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxlkv\" (UniqueName: \"kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv\") pod \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.057402 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys\") pod \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\" (UID: \"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6\") " Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.065890 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" (UID: "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.066379 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv" (OuterVolumeSpecName: "kube-api-access-wxlkv") pod "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" (UID: "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6"). InnerVolumeSpecName "kube-api-access-wxlkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.090276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" (UID: "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.127587 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data" (OuterVolumeSpecName: "config-data") pod "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" (UID: "f21be0a4-cc49-4857-a4b6-77bfa60ee1b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.159509 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.159559 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.159577 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.159596 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxlkv\" (UniqueName: \"kubernetes.io/projected/f21be0a4-cc49-4857-a4b6-77bfa60ee1b6-kube-api-access-wxlkv\") on node \"crc\" DevicePath \"\"" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.579088 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329501-pqptf" event={"ID":"f21be0a4-cc49-4857-a4b6-77bfa60ee1b6","Type":"ContainerDied","Data":"d7b22c3dc5d060df0501a5243618c9b1399ffc7ccf3a0693c67589e4760a8725"} Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.579161 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b22c3dc5d060df0501a5243618c9b1399ffc7ccf3a0693c67589e4760a8725" Oct 06 17:01:06 crc kubenswrapper[4763]: I1006 17:01:06.579317 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329501-pqptf" Oct 06 17:01:06 crc kubenswrapper[4763]: E1006 17:01:06.740945 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf21be0a4_cc49_4857_a4b6_77bfa60ee1b6.slice/crio-d7b22c3dc5d060df0501a5243618c9b1399ffc7ccf3a0693c67589e4760a8725\": RecentStats: unable to find data in memory cache]" Oct 06 17:01:12 crc kubenswrapper[4763]: I1006 17:01:12.575489 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:01:12 crc kubenswrapper[4763]: E1006 17:01:12.576836 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:01:24 crc kubenswrapper[4763]: I1006 17:01:24.576185 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:01:24 crc kubenswrapper[4763]: E1006 17:01:24.577996 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:01:38 crc kubenswrapper[4763]: I1006 17:01:38.576059 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:01:38 crc kubenswrapper[4763]: E1006 17:01:38.577378 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:01:49 crc kubenswrapper[4763]: I1006 17:01:49.576248 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:01:49 crc kubenswrapper[4763]: E1006 17:01:49.577531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:02:04 crc kubenswrapper[4763]: I1006 17:02:04.575955 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:02:04 crc kubenswrapper[4763]: E1006 17:02:04.578727 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:02:18 crc kubenswrapper[4763]: I1006 17:02:18.575345 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:02:18 crc kubenswrapper[4763]: E1006 17:02:18.576402 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:02:33 crc kubenswrapper[4763]: I1006 17:02:33.586533 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:02:33 crc kubenswrapper[4763]: E1006 17:02:33.587437 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:02:48 crc kubenswrapper[4763]: I1006 17:02:48.574895 4763 scope.go:117] "RemoveContainer" 
containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:02:49 crc kubenswrapper[4763]: I1006 17:02:49.766876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4"} Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.720800 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:06 crc kubenswrapper[4763]: E1006 17:04:06.721923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" containerName="keystone-cron" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.721940 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" containerName="keystone-cron" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.722254 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21be0a4-cc49-4857-a4b6-77bfa60ee1b6" containerName="keystone-cron" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.724191 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.742379 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.869657 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbn4v\" (UniqueName: \"kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.869869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.870067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.971968 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.972106 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " 
pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.972384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbn4v\" (UniqueName: \"kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.973365 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:06 crc kubenswrapper[4763]: I1006 17:04:06.973488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:07 crc kubenswrapper[4763]: I1006 17:04:07.004846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbn4v\" (UniqueName: \"kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v\") pod \"community-operators-9q46p\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:07 crc kubenswrapper[4763]: I1006 17:04:07.086433 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:07 crc kubenswrapper[4763]: I1006 17:04:07.632157 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:07 crc kubenswrapper[4763]: I1006 17:04:07.695814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerStarted","Data":"2827dd8c8494872a6f1f921c902208ce20d02aee92ec7fa8affa5cabdd2aeeff"} Oct 06 17:04:08 crc kubenswrapper[4763]: I1006 17:04:08.726318 4763 generic.go:334] "Generic (PLEG): container finished" podID="384a506b-32ef-4544-b178-5cdcee906a68" containerID="634a763aea2565fc9b57783e682b506f3c1193c757304c65698474478bbb2997" exitCode=0 Oct 06 17:04:08 crc kubenswrapper[4763]: I1006 17:04:08.726715 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerDied","Data":"634a763aea2565fc9b57783e682b506f3c1193c757304c65698474478bbb2997"} Oct 06 17:04:08 crc kubenswrapper[4763]: I1006 17:04:08.732434 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 17:04:09 crc kubenswrapper[4763]: I1006 17:04:09.744341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerStarted","Data":"ba537581a0f546a0b0fbeab03a1c16158d70f253a1ef4c0b9d660650458c8c2e"} Oct 06 17:04:11 crc kubenswrapper[4763]: I1006 17:04:11.771679 4763 generic.go:334] "Generic (PLEG): container finished" podID="384a506b-32ef-4544-b178-5cdcee906a68" 
containerID="ba537581a0f546a0b0fbeab03a1c16158d70f253a1ef4c0b9d660650458c8c2e" exitCode=0 Oct 06 17:04:11 crc kubenswrapper[4763]: I1006 17:04:11.771785 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerDied","Data":"ba537581a0f546a0b0fbeab03a1c16158d70f253a1ef4c0b9d660650458c8c2e"} Oct 06 17:04:12 crc kubenswrapper[4763]: I1006 17:04:12.793562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerStarted","Data":"0a08aef9f048c1cf93de42556894d037adffde236b8b3c5a57023d12bb971fee"} Oct 06 17:04:12 crc kubenswrapper[4763]: I1006 17:04:12.825824 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9q46p" podStartSLOduration=3.076457165 podStartE2EDuration="6.825805908s" podCreationTimestamp="2025-10-06 17:04:06 +0000 UTC" firstStartedPulling="2025-10-06 17:04:08.731868555 +0000 UTC m=+7845.887161097" lastFinishedPulling="2025-10-06 17:04:12.481217328 +0000 UTC m=+7849.636509840" observedRunningTime="2025-10-06 17:04:12.823591538 +0000 UTC m=+7849.978884150" watchObservedRunningTime="2025-10-06 17:04:12.825805908 +0000 UTC m=+7849.981098430" Oct 06 17:04:17 crc kubenswrapper[4763]: I1006 17:04:17.086734 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:17 crc kubenswrapper[4763]: I1006 17:04:17.087658 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:17 crc kubenswrapper[4763]: I1006 17:04:17.171653 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:17 crc kubenswrapper[4763]: I1006 17:04:17.924677 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.308097 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.309013 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9q46p" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="registry-server" containerID="cri-o://0a08aef9f048c1cf93de42556894d037adffde236b8b3c5a57023d12bb971fee" gracePeriod=2 Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.909810 4763 generic.go:334] "Generic (PLEG): container finished" podID="384a506b-32ef-4544-b178-5cdcee906a68" containerID="0a08aef9f048c1cf93de42556894d037adffde236b8b3c5a57023d12bb971fee" exitCode=0 Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.910106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerDied","Data":"0a08aef9f048c1cf93de42556894d037adffde236b8b3c5a57023d12bb971fee"} Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.910139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9q46p" 
event={"ID":"384a506b-32ef-4544-b178-5cdcee906a68","Type":"ContainerDied","Data":"2827dd8c8494872a6f1f921c902208ce20d02aee92ec7fa8affa5cabdd2aeeff"} Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.910153 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2827dd8c8494872a6f1f921c902208ce20d02aee92ec7fa8affa5cabdd2aeeff" Oct 06 17:04:20 crc kubenswrapper[4763]: I1006 17:04:20.996165 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.145373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities\") pod \"384a506b-32ef-4544-b178-5cdcee906a68\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.145551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbn4v\" (UniqueName: \"kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v\") pod \"384a506b-32ef-4544-b178-5cdcee906a68\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.145821 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content\") pod \"384a506b-32ef-4544-b178-5cdcee906a68\" (UID: \"384a506b-32ef-4544-b178-5cdcee906a68\") " Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.146173 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities" (OuterVolumeSpecName: "utilities") pod "384a506b-32ef-4544-b178-5cdcee906a68" (UID: "384a506b-32ef-4544-b178-5cdcee906a68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.146884 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.151849 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v" (OuterVolumeSpecName: "kube-api-access-xbn4v") pod "384a506b-32ef-4544-b178-5cdcee906a68" (UID: "384a506b-32ef-4544-b178-5cdcee906a68"). InnerVolumeSpecName "kube-api-access-xbn4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.225670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "384a506b-32ef-4544-b178-5cdcee906a68" (UID: "384a506b-32ef-4544-b178-5cdcee906a68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.249663 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbn4v\" (UniqueName: \"kubernetes.io/projected/384a506b-32ef-4544-b178-5cdcee906a68-kube-api-access-xbn4v\") on node \"crc\" DevicePath \"\"" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.249710 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384a506b-32ef-4544-b178-5cdcee906a68-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.922489 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9q46p" Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.957043 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:21 crc kubenswrapper[4763]: I1006 17:04:21.972948 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9q46p"] Oct 06 17:04:23 crc kubenswrapper[4763]: I1006 17:04:23.596301 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384a506b-32ef-4544-b178-5cdcee906a68" path="/var/lib/kubelet/pods/384a506b-32ef-4544-b178-5cdcee906a68/volumes" Oct 06 17:05:03 crc kubenswrapper[4763]: I1006 17:05:03.877186 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:05:03 crc kubenswrapper[4763]: I1006 17:05:03.878066 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:05:33 crc kubenswrapper[4763]: I1006 17:05:33.877453 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:05:33 crc kubenswrapper[4763]: I1006 17:05:33.878821 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:06:03 crc kubenswrapper[4763]: I1006 17:06:03.876937 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:06:03 crc kubenswrapper[4763]: I1006 17:06:03.878249 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:06:03 crc kubenswrapper[4763]: I1006 17:06:03.878319 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 17:06:03 crc kubenswrapper[4763]: I1006 17:06:03.879288 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 17:06:03 crc kubenswrapper[4763]: I1006 17:06:03.879368 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4" gracePeriod=600 Oct 06 17:06:04 crc kubenswrapper[4763]: I1006 17:06:04.178378 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4" exitCode=0 Oct 06 17:06:04 crc kubenswrapper[4763]: I1006 17:06:04.178471 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4"} Oct 06 17:06:04 crc kubenswrapper[4763]: I1006 17:06:04.178702 4763 scope.go:117] "RemoveContainer" containerID="ad90e2c2f506a39994c7f693c6645c5d97a3e8e8f03547d2c94619690211bbed" Oct 06 17:06:05 crc kubenswrapper[4763]: I1006 17:06:05.202049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4"} Oct 06 17:08:33 crc kubenswrapper[4763]: I1006 17:08:33.876707 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:08:33 crc kubenswrapper[4763]: I1006 17:08:33.877355 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:09:03 crc kubenswrapper[4763]: I1006 17:09:03.876828 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:09:03 crc kubenswrapper[4763]: I1006 17:09:03.877574 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" 
podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.694128 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:05 crc kubenswrapper[4763]: E1006 17:09:05.695180 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="extract-content" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.695202 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="extract-content" Oct 06 17:09:05 crc kubenswrapper[4763]: E1006 17:09:05.695221 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="extract-utilities" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.695231 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="extract-utilities" Oct 06 17:09:05 crc kubenswrapper[4763]: E1006 17:09:05.695269 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="registry-server" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.695280 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="registry-server" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.695660 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="384a506b-32ef-4544-b178-5cdcee906a68" containerName="registry-server" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.704315 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.712921 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.849451 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.849557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69h5\" (UniqueName: \"kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.849837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.952745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.952849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69h5\" (UniqueName: \"kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.952953 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.953509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.954064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:05 crc kubenswrapper[4763]: I1006 17:09:05.974975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q69h5\" (UniqueName: \"kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5\") pod \"redhat-operators-jtngj\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:06 crc kubenswrapper[4763]: I1006 17:09:06.024918 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:06 crc kubenswrapper[4763]: I1006 17:09:06.520959 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:07 crc kubenswrapper[4763]: I1006 17:09:07.525971 4763 generic.go:334] "Generic (PLEG): container finished" podID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerID="76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186" exitCode=0 Oct 06 17:09:07 crc kubenswrapper[4763]: I1006 17:09:07.526144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerDied","Data":"76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186"} Oct 06 17:09:07 crc kubenswrapper[4763]: I1006 17:09:07.526594 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerStarted","Data":"094d45af570b14a7ba10da896e3c7352835ac6b542a4d05f1be2a53570b50c6c"} Oct 06 17:09:08 crc kubenswrapper[4763]: I1006 17:09:08.538523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerStarted","Data":"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40"} Oct 06 17:09:12 crc kubenswrapper[4763]: I1006 17:09:12.585453 4763 generic.go:334] "Generic (PLEG): container finished" podID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerID="68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40" exitCode=0 Oct 06 17:09:12 crc kubenswrapper[4763]: I1006 17:09:12.585517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerDied","Data":"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40"} Oct 06 17:09:12 crc kubenswrapper[4763]: I1006 17:09:12.590206 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 17:09:13 crc kubenswrapper[4763]: I1006 17:09:13.602810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerStarted","Data":"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b"} Oct 06 17:09:13 crc kubenswrapper[4763]: I1006 17:09:13.631272 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtngj" podStartSLOduration=3.161216553 podStartE2EDuration="8.631247997s" podCreationTimestamp="2025-10-06 17:09:05 +0000 UTC" firstStartedPulling="2025-10-06 17:09:07.528729528 +0000 UTC m=+8144.684022040" lastFinishedPulling="2025-10-06 17:09:12.998760952 +0000 UTC m=+8150.154053484" observedRunningTime="2025-10-06 17:09:13.621391392 +0000 UTC m=+8150.776683924" watchObservedRunningTime="2025-10-06 17:09:13.631247997 +0000 UTC m=+8150.786540519" Oct 06 17:09:16 crc 
Oct 06 17:09:16 crc kubenswrapper[4763]: I1006 17:09:16.026120 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtngj"
Oct 06 17:09:16 crc kubenswrapper[4763]: I1006 17:09:16.027556 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtngj"
Oct 06 17:09:17 crc kubenswrapper[4763]: I1006 17:09:17.117851 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtngj" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" probeResult="failure" output=<
Oct 06 17:09:17 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Oct 06 17:09:17 crc kubenswrapper[4763]: >
Oct 06 17:09:27 crc kubenswrapper[4763]: I1006 17:09:27.125023 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtngj" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" probeResult="failure" output=<
Oct 06 17:09:27 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Oct 06 17:09:27 crc kubenswrapper[4763]: >
Oct 06 17:09:33 crc kubenswrapper[4763]: I1006 17:09:33.876811 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 17:09:33 crc kubenswrapper[4763]: I1006 17:09:33.877652 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 17:09:33 crc kubenswrapper[4763]: I1006 17:09:33.877726 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw"
Oct 06 17:09:33 crc kubenswrapper[4763]: I1006 17:09:33.878985 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 17:09:33 crc kubenswrapper[4763]: I1006 17:09:33.879088 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" gracePeriod=600
Oct 06 17:09:34 crc kubenswrapper[4763]: E1006 17:09:34.047693 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7"
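[Editor's note] Two different probe shapes fail in this stretch: the registry-server startup probe times out against gRPC port 50051 (the "timeout: failed to connect service" output is characteristic of grpc_health_probe), while the machine-config-daemon liveness probe is a plain HTTP GET against 127.0.0.1:8798/health. A hedged sketch of both shapes with corev1 types (illustrative; the real manifests belong to OLM and the MCO):

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// Startup probe shape matching the ":50051" output above (assumed exec form).
var startupProbe = corev1.Probe{
	ProbeHandler: corev1.ProbeHandler{
		Exec: &corev1.ExecAction{Command: []string{"grpc_health_probe", "-addr=:50051"}},
	},
	TimeoutSeconds: 1, // the probe output reports a 1s connect timeout
}

// Liveness probe shape matching the "Get http://127.0.0.1:8798/health" failure.
var livenessProbe = corev1.Probe{
	ProbeHandler: corev1.ProbeHandler{
		HTTPGet: &corev1.HTTPGetAction{Host: "127.0.0.1", Path: "/health", Port: intstr.FromInt(8798)},
	},
}

func main() { _, _ = startupProbe, livenessProbe }
```

Note the different outcomes: a failing startup probe only delays readiness, while the failing liveness probe gets the container killed and restarted, which is exactly what the "will be restarted" message records.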
(PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" exitCode=0 Oct 06 17:09:34 crc kubenswrapper[4763]: I1006 17:09:34.879218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4"} Oct 06 17:09:34 crc kubenswrapper[4763]: I1006 17:09:34.879263 4763 scope.go:117] "RemoveContainer" containerID="a962a52b96c94602c5735d4e53b4d583e93eac177e3992174806523adf4309e4" Oct 06 17:09:34 crc kubenswrapper[4763]: I1006 17:09:34.880181 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:09:34 crc kubenswrapper[4763]: E1006 17:09:34.880731 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:09:36 crc kubenswrapper[4763]: I1006 17:09:36.085875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:36 crc kubenswrapper[4763]: I1006 17:09:36.205793 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:36 crc kubenswrapper[4763]: I1006 17:09:36.896968 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:37 crc kubenswrapper[4763]: I1006 17:09:37.921114 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtngj" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" containerID="cri-o://41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b" gracePeriod=2 Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.423233 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.527905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities\") pod \"2dfee9dc-8584-4a02-a490-01ed40286b94\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.527976 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content\") pod \"2dfee9dc-8584-4a02-a490-01ed40286b94\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.528012 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69h5\" (UniqueName: \"kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5\") pod \"2dfee9dc-8584-4a02-a490-01ed40286b94\" (UID: \"2dfee9dc-8584-4a02-a490-01ed40286b94\") " Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.528858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities" (OuterVolumeSpecName: "utilities") pod "2dfee9dc-8584-4a02-a490-01ed40286b94" (UID: "2dfee9dc-8584-4a02-a490-01ed40286b94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.534600 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5" (OuterVolumeSpecName: "kube-api-access-q69h5") pod "2dfee9dc-8584-4a02-a490-01ed40286b94" (UID: "2dfee9dc-8584-4a02-a490-01ed40286b94"). InnerVolumeSpecName "kube-api-access-q69h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.622838 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dfee9dc-8584-4a02-a490-01ed40286b94" (UID: "2dfee9dc-8584-4a02-a490-01ed40286b94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.631492 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.631555 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfee9dc-8584-4a02-a490-01ed40286b94-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.631588 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69h5\" (UniqueName: \"kubernetes.io/projected/2dfee9dc-8584-4a02-a490-01ed40286b94-kube-api-access-q69h5\") on node \"crc\" DevicePath \"\"" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.933894 4763 generic.go:334] "Generic (PLEG): container finished" podID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerID="41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b" exitCode=0 Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.933978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerDied","Data":"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b"} Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.934254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtngj" event={"ID":"2dfee9dc-8584-4a02-a490-01ed40286b94","Type":"ContainerDied","Data":"094d45af570b14a7ba10da896e3c7352835ac6b542a4d05f1be2a53570b50c6c"} Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.934277 4763 scope.go:117] "RemoveContainer" containerID="41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.933999 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtngj" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.974287 4763 scope.go:117] "RemoveContainer" containerID="68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40" Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.983179 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:38 crc kubenswrapper[4763]: I1006 17:09:38.993889 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtngj"] Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.001954 4763 scope.go:117] "RemoveContainer" containerID="76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.074599 4763 scope.go:117] "RemoveContainer" containerID="41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b" Oct 06 17:09:39 crc kubenswrapper[4763]: E1006 17:09:39.075128 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b\": container with ID starting with 41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b not found: ID does not exist" containerID="41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.075181 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b"} err="failed to get container status \"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b\": rpc error: code = NotFound desc = could not find container \"41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b\": container with ID starting with 41d7ef27231ff2c269a8794d4b0b67377bd546772ead910e3080674ce0ff5d2b not found: ID does not exist" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.075215 4763 scope.go:117] "RemoveContainer" containerID="68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40" Oct 06 17:09:39 crc kubenswrapper[4763]: E1006 17:09:39.075836 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40\": container with ID starting with 68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40 not found: ID does not exist" containerID="68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.075895 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40"} err="failed to get container status \"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40\": rpc error: code = NotFound desc = could not find container \"68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40\": container with ID starting with 68d803c04d1df4edd814dcdd63871ff254d8ffdd26bd6d40c9716226ac9bca40 not found: ID does not exist" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.075941 4763 scope.go:117] "RemoveContainer" containerID="76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186" Oct 06 17:09:39 crc kubenswrapper[4763]: E1006 17:09:39.076284 4763 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186\": container with ID starting with 76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186 not found: ID does not exist" containerID="76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.076316 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186"} err="failed to get container status \"76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186\": rpc error: code = NotFound desc = could not find container \"76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186\": container with ID starting with 76e3b2bfeee73ffee85628ee0256e98912e799f79fe56e90f03a272575bc8186 not found: ID does not exist" Oct 06 17:09:39 crc kubenswrapper[4763]: I1006 17:09:39.591072 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" path="/var/lib/kubelet/pods/2dfee9dc-8584-4a02-a490-01ed40286b94/volumes" Oct 06 17:09:47 crc kubenswrapper[4763]: I1006 17:09:47.575437 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:09:47 crc kubenswrapper[4763]: E1006 17:09:47.578833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:09:58 crc kubenswrapper[4763]: I1006 17:09:58.576503 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:09:58 crc kubenswrapper[4763]: E1006 17:09:58.577603 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:10:12 crc kubenswrapper[4763]: I1006 17:10:12.577202 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:10:12 crc kubenswrapper[4763]: E1006 17:10:12.579010 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.219334 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwsqt/must-gather-vk9mz"] Oct 06 17:10:16 crc kubenswrapper[4763]: E1006 17:10:16.220402 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" 
containerName="extract-content" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.220418 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="extract-content" Oct 06 17:10:16 crc kubenswrapper[4763]: E1006 17:10:16.220433 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="extract-utilities" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.220439 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="extract-utilities" Oct 06 17:10:16 crc kubenswrapper[4763]: E1006 17:10:16.220450 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.220456 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.220694 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfee9dc-8584-4a02-a490-01ed40286b94" containerName="registry-server" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.222053 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.232444 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vwsqt"/"default-dockercfg-8p7l6" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.232763 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vwsqt"/"openshift-service-ca.crt" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.233875 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vwsqt"/"kube-root-ca.crt" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.246995 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwsqt/must-gather-vk9mz"] Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.422300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcwbh\" (UniqueName: \"kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.423009 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.525484 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcwbh\" (UniqueName: \"kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.525665 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.526278 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.543027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcwbh\" (UniqueName: \"kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh\") pod \"must-gather-vk9mz\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:16 crc kubenswrapper[4763]: I1006 17:10:16.569757 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:10:17 crc kubenswrapper[4763]: I1006 17:10:17.101191 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwsqt/must-gather-vk9mz"] Oct 06 17:10:17 crc kubenswrapper[4763]: I1006 17:10:17.372334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" event={"ID":"5d3ed696-244b-4625-aa3b-74471b9c059e","Type":"ContainerStarted","Data":"623cb125e0b6649b07da9c04a69ce49e425e33301e76d53f2defac5ff281e774"} Oct 06 17:10:22 crc kubenswrapper[4763]: I1006 17:10:22.433157 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" event={"ID":"5d3ed696-244b-4625-aa3b-74471b9c059e","Type":"ContainerStarted","Data":"1295a6c4a8ac0ef04b65fa50f2f7318d754908a78113e8c7666792acbb4d0e1c"} Oct 06 17:10:22 crc kubenswrapper[4763]: I1006 17:10:22.433774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" event={"ID":"5d3ed696-244b-4625-aa3b-74471b9c059e","Type":"ContainerStarted","Data":"f14281a272c6d6eb499b74acfa60628bba07d84be3da354162868356c12b9133"} Oct 06 17:10:22 crc kubenswrapper[4763]: I1006 17:10:22.465666 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" podStartSLOduration=1.9459892399999998 podStartE2EDuration="6.465611471s" podCreationTimestamp="2025-10-06 17:10:16 +0000 UTC" firstStartedPulling="2025-10-06 17:10:17.122113816 +0000 UTC m=+8214.277406328" lastFinishedPulling="2025-10-06 17:10:21.641736047 +0000 UTC m=+8218.797028559" observedRunningTime="2025-10-06 17:10:22.453444833 +0000 UTC m=+8219.608737345" watchObservedRunningTime="2025-10-06 17:10:22.465611471 +0000 UTC m=+8219.620903993" Oct 06 17:10:25 crc kubenswrapper[4763]: I1006 17:10:25.575224 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:10:25 crc kubenswrapper[4763]: E1006 17:10:25.576012 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:10:25 crc kubenswrapper[4763]: I1006 17:10:25.938264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-5hz96"] Oct 06 17:10:25 crc kubenswrapper[4763]: I1006 17:10:25.943848 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.132415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnn9\" (UniqueName: \"kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.132764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.234187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.234554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdnn9\" (UniqueName: \"kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.234953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.261150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdnn9\" (UniqueName: \"kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9\") pod \"crc-debug-5hz96\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: I1006 17:10:26.560990 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:10:26 crc kubenswrapper[4763]: W1006 17:10:26.629816 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fefb8a_8138_41bc_b71f_0599e6599fc2.slice/crio-63a57da4a87cbe311398e8bf34e21d3648dab4c74f4698e3c0568deb82f7a538 WatchSource:0}: Error finding container 63a57da4a87cbe311398e8bf34e21d3648dab4c74f4698e3c0568deb82f7a538: Status 404 returned error can't find the container with id 63a57da4a87cbe311398e8bf34e21d3648dab4c74f4698e3c0568deb82f7a538 Oct 06 17:10:27 crc kubenswrapper[4763]: I1006 17:10:27.478455 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" event={"ID":"90fefb8a-8138-41bc-b71f-0599e6599fc2","Type":"ContainerStarted","Data":"63a57da4a87cbe311398e8bf34e21d3648dab4c74f4698e3c0568deb82f7a538"} Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.401319 4763 scope.go:117] "RemoveContainer" containerID="0a08aef9f048c1cf93de42556894d037adffde236b8b3c5a57023d12bb971fee" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.609335 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.612159 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.641103 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.716464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.716540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx2dr\" (UniqueName: \"kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.716595 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.818886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.818970 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx2dr\" (UniqueName: 
\"kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.819025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.819345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.819449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.837629 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx2dr\" (UniqueName: \"kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr\") pod \"certified-operators-lnccv\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:35 crc kubenswrapper[4763]: I1006 17:10:35.947021 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:38 crc kubenswrapper[4763]: I1006 17:10:38.303481 4763 scope.go:117] "RemoveContainer" containerID="634a763aea2565fc9b57783e682b506f3c1193c757304c65698474478bbb2997" Oct 06 17:10:38 crc kubenswrapper[4763]: I1006 17:10:38.447333 4763 scope.go:117] "RemoveContainer" containerID="ba537581a0f546a0b0fbeab03a1c16158d70f253a1ef4c0b9d660650458c8c2e" Oct 06 17:10:38 crc kubenswrapper[4763]: I1006 17:10:38.837134 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:38 crc kubenswrapper[4763]: W1006 17:10:38.846122 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5ecb52_a6c3_4381_b79f_5ff5ab1f1bf4.slice/crio-3e7fffdae0ccaec87fd4f224e4a50b5cd5dc84dfc797c5f34edca864f336bd8e WatchSource:0}: Error finding container 3e7fffdae0ccaec87fd4f224e4a50b5cd5dc84dfc797c5f34edca864f336bd8e: Status 404 returned error can't find the container with id 3e7fffdae0ccaec87fd4f224e4a50b5cd5dc84dfc797c5f34edca864f336bd8e Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.575690 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:10:39 crc kubenswrapper[4763]: E1006 17:10:39.576580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.593151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" event={"ID":"90fefb8a-8138-41bc-b71f-0599e6599fc2","Type":"ContainerStarted","Data":"0fe442d94d083c22f951922e6ede13cbd7bf3ca2954e464db865519480eb281c"} Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.595391 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerID="dde94b3dec07844f6c2a1fb21dc084bc475757e6cd32b2c44b8c36ec89031d23" exitCode=0 Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.595444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerDied","Data":"dde94b3dec07844f6c2a1fb21dc084bc475757e6cd32b2c44b8c36ec89031d23"} Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.595476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerStarted","Data":"3e7fffdae0ccaec87fd4f224e4a50b5cd5dc84dfc797c5f34edca864f336bd8e"} Oct 06 17:10:39 crc kubenswrapper[4763]: I1006 17:10:39.616539 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" podStartSLOduration=2.713671753 podStartE2EDuration="14.616522799s" podCreationTimestamp="2025-10-06 17:10:25 +0000 UTC" firstStartedPulling="2025-10-06 17:10:26.632818475 +0000 UTC m=+8223.788110997" lastFinishedPulling="2025-10-06 17:10:38.535669531 +0000 UTC m=+8235.690962043" observedRunningTime="2025-10-06 
17:10:39.608973836 +0000 UTC m=+8236.764266348" watchObservedRunningTime="2025-10-06 17:10:39.616522799 +0000 UTC m=+8236.771815311" Oct 06 17:10:41 crc kubenswrapper[4763]: I1006 17:10:41.620388 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerStarted","Data":"c2b5ed39552e69b000e365c84c4227d18245780b4ae554cc3446a7e64a747167"} Oct 06 17:10:42 crc kubenswrapper[4763]: I1006 17:10:42.635094 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerID="c2b5ed39552e69b000e365c84c4227d18245780b4ae554cc3446a7e64a747167" exitCode=0 Oct 06 17:10:42 crc kubenswrapper[4763]: I1006 17:10:42.635167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerDied","Data":"c2b5ed39552e69b000e365c84c4227d18245780b4ae554cc3446a7e64a747167"} Oct 06 17:10:44 crc kubenswrapper[4763]: I1006 17:10:44.652449 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerStarted","Data":"5afda3e1e4a6d7461951a0cd4157878466a671c9b31ea404ae9188cdd9ee3b60"} Oct 06 17:10:44 crc kubenswrapper[4763]: I1006 17:10:44.677032 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lnccv" podStartSLOduration=4.868407798 podStartE2EDuration="9.677005447s" podCreationTimestamp="2025-10-06 17:10:35 +0000 UTC" firstStartedPulling="2025-10-06 17:10:39.597534238 +0000 UTC m=+8236.752826750" lastFinishedPulling="2025-10-06 17:10:44.406131887 +0000 UTC m=+8241.561424399" observedRunningTime="2025-10-06 17:10:44.672761833 +0000 UTC m=+8241.828054345" watchObservedRunningTime="2025-10-06 17:10:44.677005447 +0000 UTC m=+8241.832297969" Oct 06 17:10:45 crc kubenswrapper[4763]: I1006 17:10:45.947717 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:45 crc kubenswrapper[4763]: I1006 17:10:45.948092 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:47 crc kubenswrapper[4763]: I1006 17:10:47.017106 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lnccv" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="registry-server" probeResult="failure" output=< Oct 06 17:10:47 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Oct 06 17:10:47 crc kubenswrapper[4763]: > Oct 06 17:10:53 crc kubenswrapper[4763]: I1006 17:10:53.591302 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:10:53 crc kubenswrapper[4763]: E1006 17:10:53.592147 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.484140 4763 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.486976 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.497478 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.596062 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.596107 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.596138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rfm\" (UniqueName: \"kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.698245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.698290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.698324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rfm\" (UniqueName: \"kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.698927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.699007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content\") pod \"redhat-marketplace-48mwt\" (UID: 
\"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.723208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rfm\" (UniqueName: \"kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm\") pod \"redhat-marketplace-48mwt\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:55 crc kubenswrapper[4763]: I1006 17:10:55.813492 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.059241 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.127301 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.437969 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.759307 4763 generic.go:334] "Generic (PLEG): container finished" podID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerID="edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0" exitCode=0 Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.759358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerDied","Data":"edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0"} Oct 06 17:10:56 crc kubenswrapper[4763]: I1006 17:10:56.759653 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerStarted","Data":"63a460c18cb6bbdd211a294763749268dbba42f85439e1b204f7a6b270fe5e35"} Oct 06 17:10:57 crc kubenswrapper[4763]: I1006 17:10:57.770419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerStarted","Data":"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742"} Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.457839 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.458090 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lnccv" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="registry-server" containerID="cri-o://5afda3e1e4a6d7461951a0cd4157878466a671c9b31ea404ae9188cdd9ee3b60" gracePeriod=2 Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.795841 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerID="5afda3e1e4a6d7461951a0cd4157878466a671c9b31ea404ae9188cdd9ee3b60" exitCode=0 Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.796180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" 
event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerDied","Data":"5afda3e1e4a6d7461951a0cd4157878466a671c9b31ea404ae9188cdd9ee3b60"} Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.803906 4763 generic.go:334] "Generic (PLEG): container finished" podID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerID="8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742" exitCode=0 Oct 06 17:10:58 crc kubenswrapper[4763]: I1006 17:10:58.803946 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerDied","Data":"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742"} Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.015127 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.071358 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities\") pod \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.071969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content\") pod \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.072103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx2dr\" (UniqueName: \"kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr\") pod \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\" (UID: \"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4\") " Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.081959 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities" (OuterVolumeSpecName: "utilities") pod "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" (UID: "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.107523 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr" (OuterVolumeSpecName: "kube-api-access-bx2dr") pod "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" (UID: "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4"). InnerVolumeSpecName "kube-api-access-bx2dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.122962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" (UID: "2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.174231 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.174263 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx2dr\" (UniqueName: \"kubernetes.io/projected/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-kube-api-access-bx2dr\") on node \"crc\" DevicePath \"\"" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.174274 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.814028 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnccv" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.814035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnccv" event={"ID":"2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4","Type":"ContainerDied","Data":"3e7fffdae0ccaec87fd4f224e4a50b5cd5dc84dfc797c5f34edca864f336bd8e"} Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.815333 4763 scope.go:117] "RemoveContainer" containerID="5afda3e1e4a6d7461951a0cd4157878466a671c9b31ea404ae9188cdd9ee3b60" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.817514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerStarted","Data":"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136"} Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.833467 4763 scope.go:117] "RemoveContainer" containerID="c2b5ed39552e69b000e365c84c4227d18245780b4ae554cc3446a7e64a747167" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.843549 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.855027 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lnccv"] Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.871361 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48mwt" podStartSLOduration=2.296002963 podStartE2EDuration="4.871343897s" podCreationTimestamp="2025-10-06 17:10:55 +0000 UTC" firstStartedPulling="2025-10-06 17:10:56.762263969 +0000 UTC m=+8253.917556481" lastFinishedPulling="2025-10-06 17:10:59.337604903 +0000 UTC m=+8256.492897415" observedRunningTime="2025-10-06 17:10:59.862951841 +0000 UTC m=+8257.018244373" watchObservedRunningTime="2025-10-06 17:10:59.871343897 +0000 UTC m=+8257.026636409" Oct 06 17:10:59 crc kubenswrapper[4763]: I1006 17:10:59.879795 4763 scope.go:117] "RemoveContainer" containerID="dde94b3dec07844f6c2a1fb21dc084bc475757e6cd32b2c44b8c36ec89031d23" Oct 06 17:11:01 crc kubenswrapper[4763]: I1006 17:11:01.587042 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" path="/var/lib/kubelet/pods/2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4/volumes" Oct 06 17:11:05 crc kubenswrapper[4763]: I1006 17:11:05.813765 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:05 crc kubenswrapper[4763]: I1006 17:11:05.813987 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:05 crc kubenswrapper[4763]: I1006 17:11:05.876845 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:05 crc kubenswrapper[4763]: I1006 17:11:05.944315 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:06 crc kubenswrapper[4763]: I1006 17:11:06.574992 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:11:06 crc kubenswrapper[4763]: E1006 17:11:06.575915 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:11:06 crc kubenswrapper[4763]: I1006 17:11:06.810525 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:11:07 crc kubenswrapper[4763]: I1006 17:11:07.892390 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48mwt" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="registry-server" containerID="cri-o://e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136" gracePeriod=2 Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.472705 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.567297 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content\") pod \"df14726f-707a-40ab-b609-72d2f6fd29f9\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.567473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rfm\" (UniqueName: \"kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm\") pod \"df14726f-707a-40ab-b609-72d2f6fd29f9\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.567551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities\") pod \"df14726f-707a-40ab-b609-72d2f6fd29f9\" (UID: \"df14726f-707a-40ab-b609-72d2f6fd29f9\") " Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.568275 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities" (OuterVolumeSpecName: "utilities") pod "df14726f-707a-40ab-b609-72d2f6fd29f9" (UID: "df14726f-707a-40ab-b609-72d2f6fd29f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.578959 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df14726f-707a-40ab-b609-72d2f6fd29f9" (UID: "df14726f-707a-40ab-b609-72d2f6fd29f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.580286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm" (OuterVolumeSpecName: "kube-api-access-z9rfm") pod "df14726f-707a-40ab-b609-72d2f6fd29f9" (UID: "df14726f-707a-40ab-b609-72d2f6fd29f9"). InnerVolumeSpecName "kube-api-access-z9rfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.670168 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rfm\" (UniqueName: \"kubernetes.io/projected/df14726f-707a-40ab-b609-72d2f6fd29f9-kube-api-access-z9rfm\") on node \"crc\" DevicePath \"\"" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.670207 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.670221 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df14726f-707a-40ab-b609-72d2f6fd29f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.906255 4763 generic.go:334] "Generic (PLEG): container finished" podID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerID="e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136" exitCode=0 Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.906310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerDied","Data":"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136"} Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.906351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48mwt" event={"ID":"df14726f-707a-40ab-b609-72d2f6fd29f9","Type":"ContainerDied","Data":"63a460c18cb6bbdd211a294763749268dbba42f85439e1b204f7a6b270fe5e35"} Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.906350 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48mwt" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.906378 4763 scope.go:117] "RemoveContainer" containerID="e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.935114 4763 scope.go:117] "RemoveContainer" containerID="8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742" Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.969546 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:11:08 crc kubenswrapper[4763]: I1006 17:11:08.981266 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48mwt"] Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.031250 4763 scope.go:117] "RemoveContainer" containerID="edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.067783 4763 scope.go:117] "RemoveContainer" containerID="e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136" Oct 06 17:11:09 crc kubenswrapper[4763]: E1006 17:11:09.074064 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136\": container with ID starting with e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136 not found: ID does not exist" containerID="e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.074109 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136"} err="failed to get container status \"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136\": rpc error: code = NotFound desc = could not find container \"e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136\": container with ID starting with e28523be0f526fabc3b2be2a9095a993d92592e4f1072158d6ea64252a108136 not found: ID does not exist" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.074137 4763 scope.go:117] "RemoveContainer" containerID="8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742" Oct 06 17:11:09 crc kubenswrapper[4763]: E1006 17:11:09.080161 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742\": container with ID starting with 8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742 not found: ID does not exist" containerID="8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.080198 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742"} err="failed to get container status \"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742\": rpc error: code = NotFound desc = could not find container \"8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742\": container with ID starting with 8936f82e92a6879ac03328476d8aaaba4a30687c091c14527ab366790d6ca742 not found: ID does not exist" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.080223 4763 scope.go:117] "RemoveContainer" 
containerID="edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0" Oct 06 17:11:09 crc kubenswrapper[4763]: E1006 17:11:09.083532 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0\": container with ID starting with edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0 not found: ID does not exist" containerID="edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.083582 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0"} err="failed to get container status \"edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0\": rpc error: code = NotFound desc = could not find container \"edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0\": container with ID starting with edec9a23b0c77e8db7a3a895a1ad2b62bd74e1e5c4c81f5427fe75afb347eaf0 not found: ID does not exist" Oct 06 17:11:09 crc kubenswrapper[4763]: I1006 17:11:09.589140 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" path="/var/lib/kubelet/pods/df14726f-707a-40ab-b609-72d2f6fd29f9/volumes" Oct 06 17:11:17 crc kubenswrapper[4763]: I1006 17:11:17.576323 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:11:17 crc kubenswrapper[4763]: E1006 17:11:17.577001 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:11:30 crc kubenswrapper[4763]: I1006 17:11:30.575472 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:11:30 crc kubenswrapper[4763]: E1006 17:11:30.577372 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:11:41 crc kubenswrapper[4763]: I1006 17:11:41.575826 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:11:41 crc kubenswrapper[4763]: E1006 17:11:41.576658 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:11:47 crc kubenswrapper[4763]: I1006 17:11:47.963679 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_da31df66-c543-4d37-9499-4265ef5ad835/init-config-reloader/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.087082 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da31df66-c543-4d37-9499-4265ef5ad835/init-config-reloader/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.163852 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da31df66-c543-4d37-9499-4265ef5ad835/config-reloader/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.169289 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da31df66-c543-4d37-9499-4265ef5ad835/alertmanager/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.360922 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_019b83d2-a0e4-439d-8df3-f62c047400fb/aodh-api/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.404874 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_019b83d2-a0e4-439d-8df3-f62c047400fb/aodh-evaluator/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.542260 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_019b83d2-a0e4-439d-8df3-f62c047400fb/aodh-notifier/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.597918 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_019b83d2-a0e4-439d-8df3-f62c047400fb/aodh-listener/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.741268 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f584d8ff6-z7w94_cbb8d436-52d9-43fa-aa95-550b13d26658/barbican-api/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.784764 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f584d8ff6-z7w94_cbb8d436-52d9-43fa-aa95-550b13d26658/barbican-api-log/0.log" Oct 06 17:11:48 crc kubenswrapper[4763]: I1006 17:11:48.950465 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77fcfc7bd6-g8998_487c6f7a-74f9-4132-b3a9-5dcba1bcad30/barbican-keystone-listener/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.034129 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77fcfc7bd6-g8998_487c6f7a-74f9-4132-b3a9-5dcba1bcad30/barbican-keystone-listener-log/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.158421 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-555d589d8f-2chkm_0747a11c-cd13-464e-a61d-4dd28334ba1d/barbican-worker/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.265434 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-555d589d8f-2chkm_0747a11c-cd13-464e-a61d-4dd28334ba1d/barbican-worker-log/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.556378 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9073dd03-373e-4003-972e-b44569066488/ceilometer-central-agent/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.649910 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9073dd03-373e-4003-972e-b44569066488/proxy-httpd/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.658599 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ceilometer-0_9073dd03-373e-4003-972e-b44569066488/ceilometer-notification-agent/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.784215 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9073dd03-373e-4003-972e-b44569066488/sg-core/0.log" Oct 06 17:11:49 crc kubenswrapper[4763]: I1006 17:11:49.944454 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_03ffa1c1-5baf-4f5f-8120-a9b17af29abe/cinder-api/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.013707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_03ffa1c1-5baf-4f5f-8120-a9b17af29abe/cinder-api-log/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.307834 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_090fd341-e9a5-4d12-8aae-271b6b421647/probe/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.318707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_090fd341-e9a5-4d12-8aae-271b6b421647/cinder-backup/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.566167 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f0720e2f-6b06-46d1-beba-86d1e81f9f9b/cinder-scheduler/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.588892 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f0720e2f-6b06-46d1-beba-86d1e81f9f9b/probe/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.808275 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_91b0955f-bf72-48f7-86de-a50ce7701fc7/probe/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.832774 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_91b0955f-bf72-48f7-86de-a50ce7701fc7/cinder-volume/0.log" Oct 06 17:11:50 crc kubenswrapper[4763]: I1006 17:11:50.980236 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-tjwkx_71d20e8b-1ab2-4024-bd3c-4651186071c5/init/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.192983 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-tjwkx_71d20e8b-1ab2-4024-bd3c-4651186071c5/init/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.209113 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-tjwkx_71d20e8b-1ab2-4024-bd3c-4651186071c5/dnsmasq-dns/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.219205 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_399f4bf6-d9f3-4550-af17-4c87ebc31e30/glance-httpd/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.338656 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_399f4bf6-d9f3-4550-af17-4c87ebc31e30/glance-log/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.423865 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_944eb40b-1e91-47ef-8568-b156709d0b97/glance-log/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.431055 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_944eb40b-1e91-47ef-8568-b156709d0b97/glance-httpd/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.689499 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7d9998d569-7mv27_de6851e6-2f99-4200-8577-6b43830fe709/heat-api/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.738312 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-65864c57c4-kcpk5_1ae45c3e-fd98-44d2-8900-b46cc7d1428e/heat-cfnapi/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.863754 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7fb6f64cd9-654qc_44cf3c94-1942-46b0-a032-0fe8427aeb43/heat-engine/0.log" Oct 06 17:11:51 crc kubenswrapper[4763]: I1006 17:11:51.983299 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d5d79488f-hhjsh_3ae65b4f-ac98-4355-9660-b0eebb531175/horizon/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.092707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d5d79488f-hhjsh_3ae65b4f-ac98-4355-9660-b0eebb531175/horizon-log/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.308543 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b9d95cb55-7cm7p_d31fbaf3-0f27-4eec-904f-d5d23184a12d/keystone-api/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.309147 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329501-pqptf_f21be0a4-cc49-4857-a4b6-77bfa60ee1b6/keystone-cron/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.461641 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7804f1ac-7960-4f1e-9b93-33b457147757/kube-state-metrics/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.583498 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_41ea1863-d069-41c1-ba7a-93d82581a18b/manila-api/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.590988 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_41ea1863-d069-41c1-ba7a-93d82581a18b/manila-api-log/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.772392 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_9af5d28a-7810-4855-8899-509a22c241da/probe/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.801074 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_9af5d28a-7810-4855-8899-509a22c241da/manila-scheduler/0.log" Oct 06 17:11:52 crc kubenswrapper[4763]: I1006 17:11:52.953761 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_6fa8093a-25d4-4468-ad97-c79cdc10bc71/manila-share/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.007166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_6fa8093a-25d4-4468-ad97-c79cdc10bc71/probe/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.076221 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_85cf1d75-f8dd-4ab2-8c67-f3622e156c38/adoption/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.395892 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-765d7995c-g9879_74d03f25-9223-4fd7-b049-a4f63399b6c6/neutron-api/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.589050 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-765d7995c-g9879_74d03f25-9223-4fd7-b049-a4f63399b6c6/neutron-httpd/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.730143 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a00106a0-435a-4e68-854d-a810d4113012/nova-api-api/0.log" Oct 06 17:11:53 crc kubenswrapper[4763]: I1006 17:11:53.951583 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a00106a0-435a-4e68-854d-a810d4113012/nova-api-log/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.141087 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c1fed64c-1c98-4def-9138-53394c8a7181/nova-cell0-conductor-conductor/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.243569 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c8027b86-62d5-4eea-bbd4-8af38109f89f/nova-cell1-conductor-conductor/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.467021 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_21813bff-a137-4a98-b0fd-f5e639425cba/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.575150 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:11:54 crc kubenswrapper[4763]: E1006 17:11:54.575490 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.686605 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_054b7d7c-25d6-4c49-bda3-b52a32ea12a0/nova-metadata-log/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.813419 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_054b7d7c-25d6-4c49-bda3-b52a32ea12a0/nova-metadata-metadata/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.913108 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_420576a9-0dd7-4f4d-a043-579ee73ffa30/memcached/0.log" Oct 06 17:11:54 crc kubenswrapper[4763]: I1006 17:11:54.972552 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3b4e3344-ba03-4156-9ce3-6b6c2f524c5f/nova-scheduler-scheduler/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.034077 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-84d8c674c4-6d5xr_73edae2f-0842-49e6-b4ed-b05e3b60d53c/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.229157 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-84d8c674c4-6d5xr_73edae2f-0842-49e6-b4ed-b05e3b60d53c/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.301215 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-84d8c674c4-6d5xr_73edae2f-0842-49e6-b4ed-b05e3b60d53c/octavia-api-provider-agent/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.349393 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-84d8c674c4-6d5xr_73edae2f-0842-49e6-b4ed-b05e3b60d53c/octavia-api/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.449926 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-rp9zr_0287ddec-90d4-421c-b1fc-46d9f52f3b0d/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.620231 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-rp9zr_0287ddec-90d4-421c-b1fc-46d9f52f3b0d/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.654793 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qldvm_2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.718038 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-rp9zr_0287ddec-90d4-421c-b1fc-46d9f52f3b0d/octavia-healthmanager/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.845653 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qldvm_2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9/init/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.889512 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qldvm_2bb2f1c6-bb85-4065-a74d-4f6f37fb11b9/octavia-housekeeping/0.log" Oct 06 17:11:55 crc kubenswrapper[4763]: I1006 17:11:55.915847 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-s7xd5_b2686aaa-9fa1-43dd-a3af-536f1e462f28/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.083705 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-s7xd5_b2686aaa-9fa1-43dd-a3af-536f1e462f28/octavia-amphora-httpd/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.118155 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-s7xd5_b2686aaa-9fa1-43dd-a3af-536f1e462f28/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.179150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5tscr_5c494c30-afad-4e8e-89b7-702befa8ab06/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.284319 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5tscr_5c494c30-afad-4e8e-89b7-702befa8ab06/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.298404 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-5tscr_5c494c30-afad-4e8e-89b7-702befa8ab06/octavia-rsyslog/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.388275 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5kqgr_87279dd8-575d-4c81-a248-ae62c52f8c24/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.643380 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46c419e4-1d20-4b9a-a135-022addb7e278/mysql-bootstrap/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.659421 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-5kqgr_87279dd8-575d-4c81-a248-ae62c52f8c24/init/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.727202 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5kqgr_87279dd8-575d-4c81-a248-ae62c52f8c24/octavia-worker/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.891788 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46c419e4-1d20-4b9a-a135-022addb7e278/galera/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.899290 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46c419e4-1d20-4b9a-a135-022addb7e278/mysql-bootstrap/0.log" Oct 06 17:11:56 crc kubenswrapper[4763]: I1006 17:11:56.984109 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d4bc5a2-36a8-4c9f-9aad-e5b395a74954/mysql-bootstrap/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.152029 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d4bc5a2-36a8-4c9f-9aad-e5b395a74954/galera/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.158087 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d4bc5a2-36a8-4c9f-9aad-e5b395a74954/mysql-bootstrap/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.243505 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6b2385da-75ff-4a56-baf3-9632066140c6/openstackclient/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.410079 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bjmp2_1655bf62-c30d-4f1b-9c19-0fae5ec2a8da/openstack-network-exporter/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.493982 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-msztv_661e44f6-cd55-465f-beec-5061aa883c44/ovsdb-server-init/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.629219 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-msztv_661e44f6-cd55-465f-beec-5061aa883c44/ovsdb-server/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.654079 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-msztv_661e44f6-cd55-465f-beec-5061aa883c44/ovs-vswitchd/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.658419 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-msztv_661e44f6-cd55-465f-beec-5061aa883c44/ovsdb-server-init/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.798219 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-q54nb_50642abe-8228-47b7-9690-67d289d195a9/ovn-controller/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.839162 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_dec9fc58-5530-46e0-8518-edd126a266f8/adoption/0.log" Oct 06 17:11:57 crc kubenswrapper[4763]: I1006 17:11:57.985935 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c63c57da-3a5e-4144-a114-1089ea5f4ed6/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.055095 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_c63c57da-3a5e-4144-a114-1089ea5f4ed6/ovn-northd/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.333249 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e16abe9-25ba-4111-93f8-73a9dddcb7e3/ovsdbserver-nb/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.375723 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0e16abe9-25ba-4111-93f8-73a9dddcb7e3/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.506037 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_e5065bdf-e83a-427c-a0cb-c4eaae128dcd/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.509426 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_e5065bdf-e83a-427c-a0cb-c4eaae128dcd/ovsdbserver-nb/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.644984 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3c3e6677-0111-41dd-90c8-46dee432289f/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.664466 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3c3e6677-0111-41dd-90c8-46dee432289f/ovsdbserver-nb/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.780491 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a9063050-15eb-4a66-adfb-846f24c6c9cb/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.803973 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a9063050-15eb-4a66-adfb-846f24c6c9cb/ovsdbserver-sb/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.869755 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_aa9ec132-af6a-4109-a05a-d492114d1f52/openstack-network-exporter/0.log" Oct 06 17:11:58 crc kubenswrapper[4763]: I1006 17:11:58.964377 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_aa9ec132-af6a-4109-a05a-d492114d1f52/ovsdbserver-sb/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.019765 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_486912e6-bb8d-4973-8193-da8a59e0d4c9/openstack-network-exporter/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.061201 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_486912e6-bb8d-4973-8193-da8a59e0d4c9/ovsdbserver-sb/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.199816 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc4965fb-jjbpd_c5831c20-f18f-4e98-9001-80e29602b3a1/placement-api/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.237640 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc4965fb-jjbpd_c5831c20-f18f-4e98-9001-80e29602b3a1/placement-log/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.361125 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bb77f67f-1b78-4be2-be5b-bae817e4cf46/init-config-reloader/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.535716 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_bb77f67f-1b78-4be2-be5b-bae817e4cf46/config-reloader/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.537606 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bb77f67f-1b78-4be2-be5b-bae817e4cf46/init-config-reloader/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.567163 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bb77f67f-1b78-4be2-be5b-bae817e4cf46/prometheus/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.597379 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bb77f67f-1b78-4be2-be5b-bae817e4cf46/thanos-sidecar/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.761958 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5b5a728e-4fec-45c7-866e-a8dd895c6a2b/setup-container/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.892653 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5b5a728e-4fec-45c7-866e-a8dd895c6a2b/setup-container/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.933939 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5b5a728e-4fec-45c7-866e-a8dd895c6a2b/rabbitmq/0.log" Oct 06 17:11:59 crc kubenswrapper[4763]: I1006 17:11:59.968810 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b80f7807-fbc1-49a4-9792-293545704695/setup-container/0.log" Oct 06 17:12:00 crc kubenswrapper[4763]: I1006 17:12:00.148358 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b80f7807-fbc1-49a4-9792-293545704695/rabbitmq/0.log" Oct 06 17:12:00 crc kubenswrapper[4763]: I1006 17:12:00.148870 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b80f7807-fbc1-49a4-9792-293545704695/setup-container/0.log" Oct 06 17:12:06 crc kubenswrapper[4763]: I1006 17:12:06.575508 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:12:06 crc kubenswrapper[4763]: E1006 17:12:06.576356 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:12:19 crc kubenswrapper[4763]: I1006 17:12:19.575758 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:12:19 crc kubenswrapper[4763]: E1006 17:12:19.577660 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:12:33 crc kubenswrapper[4763]: I1006 17:12:33.583931 4763 scope.go:117] "RemoveContainer" 
containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:12:33 crc kubenswrapper[4763]: E1006 17:12:33.584728 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:12:39 crc kubenswrapper[4763]: I1006 17:12:39.893998 4763 generic.go:334] "Generic (PLEG): container finished" podID="90fefb8a-8138-41bc-b71f-0599e6599fc2" containerID="0fe442d94d083c22f951922e6ede13cbd7bf3ca2954e464db865519480eb281c" exitCode=0 Oct 06 17:12:39 crc kubenswrapper[4763]: I1006 17:12:39.894250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" event={"ID":"90fefb8a-8138-41bc-b71f-0599e6599fc2","Type":"ContainerDied","Data":"0fe442d94d083c22f951922e6ede13cbd7bf3ca2954e464db865519480eb281c"} Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.076812 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.124042 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-5hz96"] Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.130695 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-5hz96"] Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.224250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host\") pod \"90fefb8a-8138-41bc-b71f-0599e6599fc2\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.224521 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host" (OuterVolumeSpecName: "host") pod "90fefb8a-8138-41bc-b71f-0599e6599fc2" (UID: "90fefb8a-8138-41bc-b71f-0599e6599fc2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.225220 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdnn9\" (UniqueName: \"kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9\") pod \"90fefb8a-8138-41bc-b71f-0599e6599fc2\" (UID: \"90fefb8a-8138-41bc-b71f-0599e6599fc2\") " Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.226573 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90fefb8a-8138-41bc-b71f-0599e6599fc2-host\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.253915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9" (OuterVolumeSpecName: "kube-api-access-vdnn9") pod "90fefb8a-8138-41bc-b71f-0599e6599fc2" (UID: "90fefb8a-8138-41bc-b71f-0599e6599fc2"). InnerVolumeSpecName "kube-api-access-vdnn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.329866 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdnn9\" (UniqueName: \"kubernetes.io/projected/90fefb8a-8138-41bc-b71f-0599e6599fc2-kube-api-access-vdnn9\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.601752 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90fefb8a-8138-41bc-b71f-0599e6599fc2" path="/var/lib/kubelet/pods/90fefb8a-8138-41bc-b71f-0599e6599fc2/volumes" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.925015 4763 scope.go:117] "RemoveContainer" containerID="0fe442d94d083c22f951922e6ede13cbd7bf3ca2954e464db865519480eb281c" Oct 06 17:12:41 crc kubenswrapper[4763]: I1006 17:12:41.925111 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-5hz96" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360195 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-gn97v"] Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360634 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fefb8a-8138-41bc-b71f-0599e6599fc2" containerName="container-00" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360648 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fefb8a-8138-41bc-b71f-0599e6599fc2" containerName="container-00" Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360675 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="extract-utilities" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360681 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="extract-utilities" Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360691 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="extract-content" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360697 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="extract-content" Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360714 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="extract-content" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360720 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="extract-content" Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360726 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360732 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: E1006 17:12:42.360746 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="extract-utilities" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360751 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="extract-utilities" Oct 06 17:12:42 crc kubenswrapper[4763]: 
E1006 17:12:42.360763 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360768 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360980 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5ecb52-a6c3-4381-b79f-5ff5ab1f1bf4" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.360995 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fefb8a-8138-41bc-b71f-0599e6599fc2" containerName="container-00" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.361014 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df14726f-707a-40ab-b609-72d2f6fd29f9" containerName="registry-server" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.361740 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.458982 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24hg\" (UniqueName: \"kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.459292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.561042 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.561175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.561578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b24hg\" (UniqueName: \"kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.594812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24hg\" (UniqueName: \"kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg\") pod \"crc-debug-gn97v\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.679769 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:42 crc kubenswrapper[4763]: W1006 17:12:42.744069 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf6f341d_9a30_440a_9b1e_87ae8a21253c.slice/crio-9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a WatchSource:0}: Error finding container 9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a: Status 404 returned error can't find the container with id 9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a Oct 06 17:12:42 crc kubenswrapper[4763]: I1006 17:12:42.939438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" event={"ID":"bf6f341d-9a30-440a-9b1e-87ae8a21253c","Type":"ContainerStarted","Data":"9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a"} Oct 06 17:12:43 crc kubenswrapper[4763]: I1006 17:12:43.959434 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf6f341d-9a30-440a-9b1e-87ae8a21253c" containerID="1623aaf140c718ca4d871f59ee5e16a8a9b6203a81a1e265200b781ef34d3146" exitCode=0 Oct 06 17:12:43 crc kubenswrapper[4763]: I1006 17:12:43.959579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" event={"ID":"bf6f341d-9a30-440a-9b1e-87ae8a21253c","Type":"ContainerDied","Data":"1623aaf140c718ca4d871f59ee5e16a8a9b6203a81a1e265200b781ef34d3146"} Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.078281 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.118817 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host\") pod \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.119118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b24hg\" (UniqueName: \"kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg\") pod \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\" (UID: \"bf6f341d-9a30-440a-9b1e-87ae8a21253c\") " Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.118940 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host" (OuterVolumeSpecName: "host") pod "bf6f341d-9a30-440a-9b1e-87ae8a21253c" (UID: "bf6f341d-9a30-440a-9b1e-87ae8a21253c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.120292 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf6f341d-9a30-440a-9b1e-87ae8a21253c-host\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.126913 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg" (OuterVolumeSpecName: "kube-api-access-b24hg") pod "bf6f341d-9a30-440a-9b1e-87ae8a21253c" (UID: "bf6f341d-9a30-440a-9b1e-87ae8a21253c"). InnerVolumeSpecName "kube-api-access-b24hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.222331 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b24hg\" (UniqueName: \"kubernetes.io/projected/bf6f341d-9a30-440a-9b1e-87ae8a21253c-kube-api-access-b24hg\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.976726 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" event={"ID":"bf6f341d-9a30-440a-9b1e-87ae8a21253c","Type":"ContainerDied","Data":"9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a"} Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.976766 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c819ca4b60087c398be1b5e8fb7da60a0963760d8896b425072f9ea92b8ee5a" Oct 06 17:12:45 crc kubenswrapper[4763]: I1006 17:12:45.976794 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-gn97v" Oct 06 17:12:47 crc kubenswrapper[4763]: I1006 17:12:47.574819 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:12:47 crc kubenswrapper[4763]: E1006 17:12:47.575846 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:12:53 crc kubenswrapper[4763]: I1006 17:12:53.723039 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-gn97v"] Oct 06 17:12:53 crc kubenswrapper[4763]: I1006 17:12:53.733080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-gn97v"] Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.886877 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-wfmdj"] Oct 06 17:12:54 crc kubenswrapper[4763]: E1006 17:12:54.887836 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6f341d-9a30-440a-9b1e-87ae8a21253c" containerName="container-00" Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.887855 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6f341d-9a30-440a-9b1e-87ae8a21253c" containerName="container-00" Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.888108 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6f341d-9a30-440a-9b1e-87ae8a21253c" containerName="container-00" Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.889053 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.993697 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmb7n\" (UniqueName: \"kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:54 crc kubenswrapper[4763]: I1006 17:12:54.993923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.096495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.096653 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.096863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmb7n\" (UniqueName: \"kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.129907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmb7n\" (UniqueName: \"kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n\") pod \"crc-debug-wfmdj\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") " pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.214860 4763 util.go:30] "No sandbox for pod can be found. 
Oct 06 17:12:55 crc kubenswrapper[4763]: I1006 17:12:55.591571 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6f341d-9a30-440a-9b1e-87ae8a21253c" path="/var/lib/kubelet/pods/bf6f341d-9a30-440a-9b1e-87ae8a21253c/volumes"
Oct 06 17:12:56 crc kubenswrapper[4763]: I1006 17:12:56.084667 4763 generic.go:334] "Generic (PLEG): container finished" podID="ddbe1369-03e3-4863-a453-74f4b5ab6403" containerID="9b7949bae5f39becbe2fe8e9f49c0cee5f7943731209dfa5bb7460d79dc43c30" exitCode=0
Oct 06 17:12:56 crc kubenswrapper[4763]: I1006 17:12:56.084773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" event={"ID":"ddbe1369-03e3-4863-a453-74f4b5ab6403","Type":"ContainerDied","Data":"9b7949bae5f39becbe2fe8e9f49c0cee5f7943731209dfa5bb7460d79dc43c30"}
Oct 06 17:12:56 crc kubenswrapper[4763]: I1006 17:12:56.085052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/crc-debug-wfmdj" event={"ID":"ddbe1369-03e3-4863-a453-74f4b5ab6403","Type":"ContainerStarted","Data":"8a850adbb13de8b9b75f7491d184e3a37f9d0337c23e8c358bf853cb88cdaa61"}
Oct 06 17:12:56 crc kubenswrapper[4763]: I1006 17:12:56.139954 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-wfmdj"]
Oct 06 17:12:56 crc kubenswrapper[4763]: I1006 17:12:56.151838 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vwsqt/crc-debug-wfmdj"]
Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.229605 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/crc-debug-wfmdj"
Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.346842 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmb7n\" (UniqueName: \"kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n\") pod \"ddbe1369-03e3-4863-a453-74f4b5ab6403\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") "
Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.347094 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host\") pod \"ddbe1369-03e3-4863-a453-74f4b5ab6403\" (UID: \"ddbe1369-03e3-4863-a453-74f4b5ab6403\") "
Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.347577 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host" (OuterVolumeSpecName: "host") pod "ddbe1369-03e3-4863-a453-74f4b5ab6403" (UID: "ddbe1369-03e3-4863-a453-74f4b5ab6403"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.368440 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n" (OuterVolumeSpecName: "kube-api-access-fmb7n") pod "ddbe1369-03e3-4863-a453-74f4b5ab6403" (UID: "ddbe1369-03e3-4863-a453-74f4b5ab6403"). InnerVolumeSpecName "kube-api-access-fmb7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
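[Editor's note] Note the ordering above: the PLEG reports ContainerDied and then ContainerStarted for the same pod in one batch. The pod lifecycle event generator periodically relists containers from the runtime and emits an event per observed state change, in whatever order the relist diff produces; the sync loop then reconciles. A toy relist diff, with invented types (the real kubelet's events carry the richer payloads seen in the "SyncLoop (PLEG)" lines):

```go
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

// relist diffs the previous container-state snapshot against the current
// one and emits lifecycle events, PLEG-style.
func relist(old, cur map[string]state) []string {
	var events []string
	for id, s := range cur {
		prev, seen := old[id]
		switch {
		case !seen && s == running:
			events = append(events, "ContainerStarted "+id)
		case seen && prev == running && s == exited:
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	old := map[string]state{"9b7949ba": running}
	cur := map[string]state{"9b7949ba": exited, "8a850adb": running}
	for _, e := range relist(old, cur) {
		fmt.Println(e) // order depends on map iteration, much like a relist batch
	}
}
```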
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.449765 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe1369-03e3-4863-a453-74f4b5ab6403-host\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.449815 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmb7n\" (UniqueName: \"kubernetes.io/projected/ddbe1369-03e3-4863-a453-74f4b5ab6403-kube-api-access-fmb7n\") on node \"crc\" DevicePath \"\"" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.585181 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbe1369-03e3-4863-a453-74f4b5ab6403" path="/var/lib/kubelet/pods/ddbe1369-03e3-4863-a453-74f4b5ab6403/volumes" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.736754 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/util/0.log" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.922257 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/pull/0.log" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.948672 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/util/0.log" Oct 06 17:12:57 crc kubenswrapper[4763]: I1006 17:12:57.958826 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/pull/0.log" Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.110684 4763 scope.go:117] "RemoveContainer" containerID="9b7949bae5f39becbe2fe8e9f49c0cee5f7943731209dfa5bb7460d79dc43c30" Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.110720 4763 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.166079 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/pull/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.175638 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/util/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.199074 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0fb7be3d855955d7072cc0fb22ee445dab10bffba7261b457a5d424b63h9jjs_6e112159-8d72-4a19-9162-619d8f9bfa45/extract/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.343865 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-ldl9r_dd4eb024-6f04-4ee5-a485-78311ddee488/kube-rbac-proxy/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.425521 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-ctdbs_f76d9cdd-3955-4727-87b4-20bf248be0f2/kube-rbac-proxy/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.439015 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-ldl9r_dd4eb024-6f04-4ee5-a485-78311ddee488/manager/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.572668 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-ctdbs_f76d9cdd-3955-4727-87b4-20bf248be0f2/manager/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.589997 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-2k5df_2c0fa69d-0c27-48bc-b120-e1416445de40/kube-rbac-proxy/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.638769 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-2k5df_2c0fa69d-0c27-48bc-b120-e1416445de40/manager/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.731806 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-cwcd8_64a78de0-4ad4-40f2-bf1c-a830f4c32dd1/kube-rbac-proxy/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.911855 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-cwcd8_64a78de0-4ad4-40f2-bf1c-a830f4c32dd1/manager/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.962262 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-pc484_4a94b2e1-c815-45d3-b7ac-aa7b632af0ef/manager/0.log"
Oct 06 17:12:58 crc kubenswrapper[4763]: I1006 17:12:58.965061 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-pc484_4a94b2e1-c815-45d3-b7ac-aa7b632af0ef/kube-rbac-proxy/0.log"
Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.079822 4763 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-s54qg_e5c474ec-88f8-4c60-a866-fcc48cf5bcec/kube-rbac-proxy/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.210761 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-s54qg_e5c474ec-88f8-4c60-a866-fcc48cf5bcec/manager/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.227856 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-625x7_1e501127-6403-4475-896e-52efee97894e/kube-rbac-proxy/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.492226 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-x7zlp_97d973de-bf67-44c5-b76c-dcb36cab65b4/manager/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.493179 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-x7zlp_97d973de-bf67-44c5-b76c-dcb36cab65b4/kube-rbac-proxy/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.503349 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-625x7_1e501127-6403-4475-896e-52efee97894e/manager/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.639167 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-7hqwv_0c3ffc96-39cf-4c8d-9b14-90293eabb117/kube-rbac-proxy/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.792070 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-7hqwv_0c3ffc96-39cf-4c8d-9b14-90293eabb117/manager/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.809233 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-lwhfc_af8be53e-818e-4c1e-a90c-800b02a679e4/kube-rbac-proxy/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.862642 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-lwhfc_af8be53e-818e-4c1e-a90c-800b02a679e4/manager/0.log" Oct 06 17:12:59 crc kubenswrapper[4763]: I1006 17:12:59.983555 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd_96be297b-bf5c-4abe-853f-8562201ec721/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.020262 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-5hpxd_96be297b-bf5c-4abe-853f-8562201ec721/manager/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.091741 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-dp6zq_5f6171a0-9860-4ea2-b850-dc4a67f25499/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.200054 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-dp6zq_5f6171a0-9860-4ea2-b850-dc4a67f25499/manager/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.254111 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-m4l5b_a0c790cf-619e-4986-8dbb-f9ed71be08c7/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.433049 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-m4l5b_a0c790cf-619e-4986-8dbb-f9ed71be08c7/manager/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.542599 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-th4tv_216ab588-0962-451e-8481-2dff1de05a59/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.580133 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-th4tv_216ab588-0962-451e-8481-2dff1de05a59/manager/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.651684 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn_be5ee6ad-dda0-4cb8-8c8a-e8051083873a/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.688887 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cnmbgn_be5ee6ad-dda0-4cb8-8c8a-e8051083873a/manager/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.778501 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-847bc59d9d-vvb5x_4e80e79e-1bcd-40f5-a520-f8a850bd5cf5/kube-rbac-proxy/0.log" Oct 06 17:13:00 crc kubenswrapper[4763]: I1006 17:13:00.909584 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bbd86684c-ss5vd_ff1c32b8-62e7-4e36-ba7f-39fe0d187db3/kube-rbac-proxy/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.062907 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6qtzn_cc23101a-9ccd-42c8-bcbc-3a510205301e/registry-server/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.085533 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bbd86684c-ss5vd_ff1c32b8-62e7-4e36-ba7f-39fe0d187db3/operator/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.231150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-zlwcn_4afc1cb2-2dde-439a-88ea-8aaeedbadc53/kube-rbac-proxy/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.363108 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-hbvnb_d7407f54-5a8a-4aa2-a65c-5cba9feedd94/kube-rbac-proxy/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.398193 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-zlwcn_4afc1cb2-2dde-439a-88ea-8aaeedbadc53/manager/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.502145 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-hbvnb_d7407f54-5a8a-4aa2-a65c-5cba9feedd94/manager/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: 
I1006 17:13:01.574699 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:13:01 crc kubenswrapper[4763]: E1006 17:13:01.575001 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.608702 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-69cq6_c48055a7-5392-47b3-a30c-f2f97c8463cc/operator/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.746085 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-brpsw_1e2641ba-13ac-4406-a4a7-79d2a4e7bafd/kube-rbac-proxy/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.835949 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-brpsw_1e2641ba-13ac-4406-a4a7-79d2a4e7bafd/manager/0.log" Oct 06 17:13:01 crc kubenswrapper[4763]: I1006 17:13:01.873439 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-nnrgf_b754fcb6-88a9-42a9-9cfc-e624fe6d1afa/kube-rbac-proxy/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.039402 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-w99ph_7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f/kube-rbac-proxy/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.140637 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-w99ph_7e66e7e7-1f92-49f9-b27f-a2e0c7ec568f/manager/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.192505 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-nnrgf_b754fcb6-88a9-42a9-9cfc-e624fe6d1afa/manager/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.210601 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-jsb9x_948b7d9f-215b-47d3-b8dc-a953d26e9041/kube-rbac-proxy/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.365718 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-jsb9x_948b7d9f-215b-47d3-b8dc-a953d26e9041/manager/0.log" Oct 06 17:13:02 crc kubenswrapper[4763]: I1006 17:13:02.941845 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-847bc59d9d-vvb5x_4e80e79e-1bcd-40f5-a520-f8a850bd5cf5/manager/0.log" Oct 06 17:13:15 crc kubenswrapper[4763]: I1006 17:13:15.575462 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:13:15 crc kubenswrapper[4763]: E1006 17:13:15.577532 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:13:18 crc kubenswrapper[4763]: I1006 17:13:18.057531 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c8t2s_519076e5-c5f8-4122-aa97-7941d40204dc/control-plane-machine-set-operator/0.log" Oct 06 17:13:18 crc kubenswrapper[4763]: I1006 17:13:18.239150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wqpfm_bd26b8e0-dc7c-4d93-bb0f-3bf9de025430/kube-rbac-proxy/0.log" Oct 06 17:13:18 crc kubenswrapper[4763]: I1006 17:13:18.263118 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wqpfm_bd26b8e0-dc7c-4d93-bb0f-3bf9de025430/machine-api-operator/0.log" Oct 06 17:13:28 crc kubenswrapper[4763]: I1006 17:13:28.575231 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:13:28 crc kubenswrapper[4763]: E1006 17:13:28.576072 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:13:30 crc kubenswrapper[4763]: I1006 17:13:30.502321 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-zm2kr_f65abbd1-72e7-4956-b93a-10f9957210fe/cert-manager-controller/0.log" Oct 06 17:13:30 crc kubenswrapper[4763]: I1006 17:13:30.694459 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-vg7rr_065553f7-5ee3-4c9f-ac08-a22904cfd751/cert-manager-cainjector/0.log" Oct 06 17:13:30 crc kubenswrapper[4763]: I1006 17:13:30.754693 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-rwjdz_20b26141-5d1b-4c8f-97fb-626f669768b1/cert-manager-webhook/0.log" Oct 06 17:13:41 crc kubenswrapper[4763]: I1006 17:13:41.575042 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:13:41 crc kubenswrapper[4763]: E1006 17:13:41.576191 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.217580 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-nbrs9_9f18885b-ea4b-4790-9482-0d327c1873b3/nmstate-console-plugin/0.log" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.372868 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-s4ftg_6b03334e-a7ad-4cdd-9d7e-e1f8ab9686ef/nmstate-handler/0.log" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.437863 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kvgpn_5f21f303-50e0-4f36-aa64-53c5ae4f27c0/nmstate-metrics/0.log" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.441409 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kvgpn_5f21f303-50e0-4f36-aa64-53c5ae4f27c0/kube-rbac-proxy/0.log" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.606695 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5plp9_1e655810-1b15-490a-ac92-54845114c6f6/nmstate-operator/0.log" Oct 06 17:13:43 crc kubenswrapper[4763]: I1006 17:13:43.684766 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-vvbkc_a8dc2d16-42c4-438c-930b-c6de9707aa93/nmstate-webhook/0.log" Oct 06 17:13:54 crc kubenswrapper[4763]: I1006 17:13:54.575168 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:13:54 crc kubenswrapper[4763]: E1006 17:13:54.575840 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:13:58 crc kubenswrapper[4763]: I1006 17:13:58.580898 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-tfsjz_52461501-c1a5-4e78-8ed6-2d33efa547bf/kube-rbac-proxy/0.log" Oct 06 17:13:58 crc kubenswrapper[4763]: I1006 17:13:58.834513 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zc5sb_897ea117-5211-437f-8ef9-206a43a00850/frr-k8s-webhook-server/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.078907 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-tfsjz_52461501-c1a5-4e78-8ed6-2d33efa547bf/controller/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.085860 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-frr-files/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.246561 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-frr-files/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.263847 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-reloader/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.297155 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-metrics/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.309664 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-reloader/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 
17:13:59.539492 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-reloader/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.562882 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-frr-files/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.584788 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-metrics/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.584974 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-metrics/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.761006 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-reloader/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.789560 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-metrics/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.810434 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/cp-frr-files/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.815764 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/controller/0.log" Oct 06 17:13:59 crc kubenswrapper[4763]: I1006 17:13:59.956287 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/frr-metrics/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.003838 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/kube-rbac-proxy/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.100960 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/kube-rbac-proxy-frr/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.222436 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/reloader/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.363601 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5588db756d-q7lc2_728f41e8-a80b-4512-b663-fa525ae11afd/manager/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.485441 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-f4bd4784-h52rl_e5bf63ac-288e-44d9-a8cb-5f8ade87c0e8/webhook-server/0.log" Oct 06 17:14:00 crc kubenswrapper[4763]: I1006 17:14:00.757499 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rlg6x_f153a5a1-097a-4743-91df-8aad1eb19149/kube-rbac-proxy/0.log" Oct 06 17:14:01 crc kubenswrapper[4763]: I1006 17:14:01.554040 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rlg6x_f153a5a1-097a-4743-91df-8aad1eb19149/speaker/0.log" Oct 06 17:14:02 crc kubenswrapper[4763]: I1006 17:14:02.701442 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zgs9f_88887d3f-347d-41bd-9e25-1dcfa97b2175/frr/0.log" Oct 06 17:14:06 crc kubenswrapper[4763]: I1006 17:14:06.575319 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:14:06 crc kubenswrapper[4763]: E1006 17:14:06.575973 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:14:14 crc kubenswrapper[4763]: I1006 17:14:14.952284 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.109879 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.130564 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/pull/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.150893 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/pull/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.301753 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.357540 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/extract/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.357926 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69wb2pp_f596e576-84ee-4cf9-9a96-c342d16ece3b/pull/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.479697 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.629569 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.657338 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/pull/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.667871 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/pull/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.822301 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/util/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.843771 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/extract/0.log" Oct 06 17:14:15 crc kubenswrapper[4763]: I1006 17:14:15.857953 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d27kkq8_14a63df7-b6aa-47b5-a09c-0c8b4af726df/pull/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.022071 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/util/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.139139 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/util/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.146579 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/pull/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.148974 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/pull/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.320119 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/pull/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.323936 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/extract/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.330399 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dctmqg_a27c468f-169a-43cb-8cb4-9265e9d62b63/util/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.517435 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-utilities/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.656042 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-utilities/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.661570 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-content/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.681100 
4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-content/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.834396 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-content/0.log" Oct 06 17:14:16 crc kubenswrapper[4763]: I1006 17:14:16.865451 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/extract-utilities/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.033917 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-utilities/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.293189 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-content/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.300066 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-utilities/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.371635 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-content/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.533459 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-content/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.543587 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/extract-utilities/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.737986 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/util/0.log" Oct 06 17:14:17 crc kubenswrapper[4763]: I1006 17:14:17.961433 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/util/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.035303 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/pull/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.177373 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/pull/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.368072 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/util/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.437204 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/pull/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.439747 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj6pvl_04a1458d-6c4f-4a7a-8394-b10333d04d20/extract/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.638131 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c24vp_f6522382-5c77-4a83-9955-edec44f773dc/registry-server/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.673823 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dwnx6_dfcf8a62-1f4d-4af8-8bec-5eabdba0df59/marketplace-operator/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.819718 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-utilities/0.log" Oct 06 17:14:18 crc kubenswrapper[4763]: I1006 17:14:18.993194 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-utilities/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.018007 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-content/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.069141 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-content/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.193751 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7xjg2_d033ece4-a5ff-4492-92e9-2a2af22d0ce9/registry-server/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.277048 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-utilities/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.332729 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/extract-content/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.398226 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-utilities/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.575273 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:14:19 crc kubenswrapper[4763]: E1006 17:14:19.575900 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.580211 4763 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-utilities/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.591001 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-66hlg_b659ecc7-c237-4169-8512-3f4e9ad3133b/registry-server/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.600598 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-content/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.631065 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-content/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.758591 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-utilities/0.log" Oct 06 17:14:19 crc kubenswrapper[4763]: I1006 17:14:19.787166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/extract-content/0.log" Oct 06 17:14:20 crc kubenswrapper[4763]: I1006 17:14:20.626829 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rj6qt_f0874cb8-27ec-46e8-b17d-f50e1a6c63ea/registry-server/0.log" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.322674 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-p7j68_10de2cae-960d-44d9-aa93-1daddae117ed/prometheus-operator/0.log" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.477487 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-654cd64fd6-8szn6_a462d496-370d-4f61-adf3-ce12c97ef3be/prometheus-operator-admission-webhook/0.log" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.530908 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-654cd64fd6-lmkcg_8ae1ce1b-34cc-4085-b248-4e17a14b098b/prometheus-operator-admission-webhook/0.log" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.575459 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4" Oct 06 17:14:32 crc kubenswrapper[4763]: E1006 17:14:32.575912 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9g2sw_openshift-machine-config-operator(4c91c0c6-f031-4840-bc66-ab38e8fb67c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.644459 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-mhpbv_b2ac9613-496b-40ce-b728-8ba688e8333c/operator/0.log" Oct 06 17:14:32 crc kubenswrapper[4763]: I1006 17:14:32.704250 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-7sql4_ecd776ba-98eb-4204-939c-7ac1f4cd216a/perses-operator/0.log" Oct 06 17:14:39 crc kubenswrapper[4763]: 
E1006 17:14:39.185566 4763 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.233:52736->38.102.83.233:44549: read tcp 38.102.83.233:52736->38.102.83.233:44549: read: connection reset by peer
Oct 06 17:14:43 crc kubenswrapper[4763]: I1006 17:14:43.586737 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4"
Oct 06 17:14:44 crc kubenswrapper[4763]: I1006 17:14:44.169226 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"78122d4a4b1a38964c34d3e4aeb5fd4ecebf34cb609a41e7e87f3e6de70968f1"}
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.153047 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"]
Oct 06 17:15:00 crc kubenswrapper[4763]: E1006 17:15:00.154384 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbe1369-03e3-4863-a453-74f4b5ab6403" containerName="container-00"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.154406 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbe1369-03e3-4863-a453-74f4b5ab6403" containerName="container-00"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.154860 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbe1369-03e3-4863-a453-74f4b5ab6403" containerName="container-00"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.156062 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.157870 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.160635 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.163497 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"]
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.227946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvg45\" (UniqueName: \"kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.228172 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"
Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.228386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"
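[Editor's note] Two things worth flagging above. First, at 17:14:44 the machine-config-daemon container finally starts (ContainerStarted 78122d4a...): the five-minute crash backoff that produced all the earlier "back-off 5m0s" errors has expired and the restart succeeded. Second, the pod name collect-profiles-29329515-z9qhb encodes its schedule: the CronJob controller names each Job "<cronjob>-<scheduled time in minutes since the Unix epoch>", and the pod inherits that Job name plus a random suffix. Decoding the suffix recovers exactly the 17:15:00 scheduling time seen here; the conversion is standard, only the print format below is ours:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// From the Job name "collect-profiles-29329515": minutes since epoch.
	const suffix = 29329515
	t := time.Unix(suffix*60, 0).UTC()
	fmt.Println(t) // 2025-10-06 17:15:00 +0000 UTC, matching the SyncLoop ADD above
}
```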
\"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.330295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.330387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.330478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvg45\" (UniqueName: \"kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.331464 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.337103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.350099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvg45\" (UniqueName: \"kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45\") pod \"collect-profiles-29329515-z9qhb\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.485587 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:00 crc kubenswrapper[4763]: I1006 17:15:00.977429 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb"] Oct 06 17:15:00 crc kubenswrapper[4763]: W1006 17:15:00.990386 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70df388b_dd78_4135_8919_c6f54862352f.slice/crio-630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3 WatchSource:0}: Error finding container 630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3: Status 404 returned error can't find the container with id 630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3 Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.326684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" event={"ID":"70df388b-dd78-4135-8919-c6f54862352f","Type":"ContainerStarted","Data":"ca40a510055b3ac32184d79e65fc31e1899dd194659e4b6b24cbcf8a9b36f1c5"} Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.326958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" event={"ID":"70df388b-dd78-4135-8919-c6f54862352f","Type":"ContainerStarted","Data":"630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3"} Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.457451 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.459575 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.472433 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.556609 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.556873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpnnj\" (UniqueName: \"kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.556923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.658745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpnnj\" (UniqueName: \"kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.658803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.658923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.659390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.659453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.676814 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rpnnj\" (UniqueName: \"kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj\") pod \"community-operators-bhr79\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:01 crc kubenswrapper[4763]: I1006 17:15:01.777360 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:02 crc kubenswrapper[4763]: I1006 17:15:02.330428 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:02 crc kubenswrapper[4763]: I1006 17:15:02.398444 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" podStartSLOduration=2.398428079 podStartE2EDuration="2.398428079s" podCreationTimestamp="2025-10-06 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 17:15:02.394844312 +0000 UTC m=+8499.550136824" watchObservedRunningTime="2025-10-06 17:15:02.398428079 +0000 UTC m=+8499.553720591" Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.381583 4763 generic.go:334] "Generic (PLEG): container finished" podID="70df388b-dd78-4135-8919-c6f54862352f" containerID="ca40a510055b3ac32184d79e65fc31e1899dd194659e4b6b24cbcf8a9b36f1c5" exitCode=0 Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.381679 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" event={"ID":"70df388b-dd78-4135-8919-c6f54862352f","Type":"ContainerDied","Data":"ca40a510055b3ac32184d79e65fc31e1899dd194659e4b6b24cbcf8a9b36f1c5"} Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.384033 4763 generic.go:334] "Generic (PLEG): container finished" podID="26e89535-e72f-4449-b89c-5660cc8d0a5c" containerID="c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095" exitCode=0 Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.384077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerDied","Data":"c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095"} Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.384103 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerStarted","Data":"6a3ea3a842be4d3389de6b19878bc618916928e52120f529105d32333b2c89a9"} Oct 06 17:15:03 crc kubenswrapper[4763]: I1006 17:15:03.386502 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.792628 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.942995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvg45\" (UniqueName: \"kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45\") pod \"70df388b-dd78-4135-8919-c6f54862352f\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.943395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume\") pod \"70df388b-dd78-4135-8919-c6f54862352f\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.943455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume\") pod \"70df388b-dd78-4135-8919-c6f54862352f\" (UID: \"70df388b-dd78-4135-8919-c6f54862352f\") " Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.944876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume" (OuterVolumeSpecName: "config-volume") pod "70df388b-dd78-4135-8919-c6f54862352f" (UID: "70df388b-dd78-4135-8919-c6f54862352f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.948850 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70df388b-dd78-4135-8919-c6f54862352f" (UID: "70df388b-dd78-4135-8919-c6f54862352f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 17:15:04 crc kubenswrapper[4763]: I1006 17:15:04.950151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45" (OuterVolumeSpecName: "kube-api-access-kvg45") pod "70df388b-dd78-4135-8919-c6f54862352f" (UID: "70df388b-dd78-4135-8919-c6f54862352f"). InnerVolumeSpecName "kube-api-access-kvg45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.045798 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvg45\" (UniqueName: \"kubernetes.io/projected/70df388b-dd78-4135-8919-c6f54862352f-kube-api-access-kvg45\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.045829 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70df388b-dd78-4135-8919-c6f54862352f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.045839 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70df388b-dd78-4135-8919-c6f54862352f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.416629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" event={"ID":"70df388b-dd78-4135-8919-c6f54862352f","Type":"ContainerDied","Data":"630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3"} Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.416677 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630403750b66719c3355b91d82502acf8e2d597ff61794bc7ce1aabf13420ef3" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.416748 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329515-z9qhb" Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.475439 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n"] Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.483306 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329470-h5p9n"] Oct 06 17:15:05 crc kubenswrapper[4763]: I1006 17:15:05.587603 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45855e5d-ec6c-429b-8792-ddd939f5b4db" path="/var/lib/kubelet/pods/45855e5d-ec6c-429b-8792-ddd939f5b4db/volumes" Oct 06 17:15:06 crc kubenswrapper[4763]: I1006 17:15:06.428575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerStarted","Data":"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282"} Oct 06 17:15:08 crc kubenswrapper[4763]: I1006 17:15:08.448937 4763 generic.go:334] "Generic (PLEG): container finished" podID="26e89535-e72f-4449-b89c-5660cc8d0a5c" containerID="b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282" exitCode=0 Oct 06 17:15:08 crc kubenswrapper[4763]: I1006 17:15:08.448967 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerDied","Data":"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282"} Oct 06 17:15:09 crc kubenswrapper[4763]: I1006 17:15:09.460879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerStarted","Data":"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022"} Oct 06 17:15:09 crc kubenswrapper[4763]: 
I1006 17:15:09.482084 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhr79" podStartSLOduration=3.036151363 podStartE2EDuration="8.482064359s" podCreationTimestamp="2025-10-06 17:15:01 +0000 UTC" firstStartedPulling="2025-10-06 17:15:03.38629188 +0000 UTC m=+8500.541584392" lastFinishedPulling="2025-10-06 17:15:08.832204876 +0000 UTC m=+8505.987497388" observedRunningTime="2025-10-06 17:15:09.478505433 +0000 UTC m=+8506.633797945" watchObservedRunningTime="2025-10-06 17:15:09.482064359 +0000 UTC m=+8506.637356871" Oct 06 17:15:11 crc kubenswrapper[4763]: I1006 17:15:11.778106 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:11 crc kubenswrapper[4763]: I1006 17:15:11.778725 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:11 crc kubenswrapper[4763]: I1006 17:15:11.875230 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:21 crc kubenswrapper[4763]: I1006 17:15:21.862609 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:21 crc kubenswrapper[4763]: I1006 17:15:21.928175 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:22 crc kubenswrapper[4763]: I1006 17:15:22.626248 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bhr79" podUID="26e89535-e72f-4449-b89c-5660cc8d0a5c" containerName="registry-server" containerID="cri-o://819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022" gracePeriod=2 Oct 06 17:15:22 crc kubenswrapper[4763]: E1006 17:15:22.950046 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e89535_e72f_4449_b89c_5660cc8d0a5c.slice/crio-conmon-819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022.scope\": RecentStats: unable to find data in memory cache]" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.187336 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.289449 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content\") pod \"26e89535-e72f-4449-b89c-5660cc8d0a5c\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.289638 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities\") pod \"26e89535-e72f-4449-b89c-5660cc8d0a5c\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.289755 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpnnj\" (UniqueName: \"kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj\") pod \"26e89535-e72f-4449-b89c-5660cc8d0a5c\" (UID: \"26e89535-e72f-4449-b89c-5660cc8d0a5c\") " Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.290753 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities" (OuterVolumeSpecName: "utilities") pod "26e89535-e72f-4449-b89c-5660cc8d0a5c" (UID: "26e89535-e72f-4449-b89c-5660cc8d0a5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.299886 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj" (OuterVolumeSpecName: "kube-api-access-rpnnj") pod "26e89535-e72f-4449-b89c-5660cc8d0a5c" (UID: "26e89535-e72f-4449-b89c-5660cc8d0a5c"). InnerVolumeSpecName "kube-api-access-rpnnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.351113 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26e89535-e72f-4449-b89c-5660cc8d0a5c" (UID: "26e89535-e72f-4449-b89c-5660cc8d0a5c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.393135 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpnnj\" (UniqueName: \"kubernetes.io/projected/26e89535-e72f-4449-b89c-5660cc8d0a5c-kube-api-access-rpnnj\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.393170 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.393184 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e89535-e72f-4449-b89c-5660cc8d0a5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.640588 4763 generic.go:334] "Generic (PLEG): container finished" podID="26e89535-e72f-4449-b89c-5660cc8d0a5c" containerID="819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022" exitCode=0 Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.640652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerDied","Data":"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022"} Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.640683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhr79" event={"ID":"26e89535-e72f-4449-b89c-5660cc8d0a5c","Type":"ContainerDied","Data":"6a3ea3a842be4d3389de6b19878bc618916928e52120f529105d32333b2c89a9"} Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.640683 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhr79" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.640704 4763 scope.go:117] "RemoveContainer" containerID="819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.683298 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.688777 4763 scope.go:117] "RemoveContainer" containerID="b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.694459 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bhr79"] Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.732091 4763 scope.go:117] "RemoveContainer" containerID="c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.797113 4763 scope.go:117] "RemoveContainer" containerID="819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022" Oct 06 17:15:23 crc kubenswrapper[4763]: E1006 17:15:23.797851 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022\": container with ID starting with 819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022 not found: ID does not exist" containerID="819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.797896 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022"} err="failed to get container status \"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022\": rpc error: code = NotFound desc = could not find container \"819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022\": container with ID starting with 819196258a4e2ee1d250204acf2decac4f758d5cf44956164ceb1de1f0997022 not found: ID does not exist" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.797926 4763 scope.go:117] "RemoveContainer" containerID="b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282" Oct 06 17:15:23 crc kubenswrapper[4763]: E1006 17:15:23.798368 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282\": container with ID starting with b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282 not found: ID does not exist" containerID="b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.798406 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282"} err="failed to get container status \"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282\": rpc error: code = NotFound desc = could not find container \"b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282\": container with ID starting with b4fbb1c798623dc717cd4b50ed2a3e4cd579046f7bb2e9a708ad7caddc006282 not found: ID does not exist" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.798436 4763 scope.go:117] "RemoveContainer" 
containerID="c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095" Oct 06 17:15:23 crc kubenswrapper[4763]: E1006 17:15:23.798792 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095\": container with ID starting with c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095 not found: ID does not exist" containerID="c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095" Oct 06 17:15:23 crc kubenswrapper[4763]: I1006 17:15:23.798821 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095"} err="failed to get container status \"c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095\": rpc error: code = NotFound desc = could not find container \"c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095\": container with ID starting with c487e96f45e8a6e9bdd37dcac4423314fdad666e786a0511e3e0c879db14e095 not found: ID does not exist" Oct 06 17:15:25 crc kubenswrapper[4763]: I1006 17:15:25.600674 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e89535-e72f-4449-b89c-5660cc8d0a5c" path="/var/lib/kubelet/pods/26e89535-e72f-4449-b89c-5660cc8d0a5c/volumes" Oct 06 17:15:38 crc kubenswrapper[4763]: I1006 17:15:38.848988 4763 scope.go:117] "RemoveContainer" containerID="80d4a263df705624b3d550a8e9a69ccc5bfa95b4708485568795b58b7fe04f32" Oct 06 17:16:54 crc kubenswrapper[4763]: I1006 17:16:54.788444 4763 generic.go:334] "Generic (PLEG): container finished" podID="5d3ed696-244b-4625-aa3b-74471b9c059e" containerID="f14281a272c6d6eb499b74acfa60628bba07d84be3da354162868356c12b9133" exitCode=0 Oct 06 17:16:54 crc kubenswrapper[4763]: I1006 17:16:54.788525 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" event={"ID":"5d3ed696-244b-4625-aa3b-74471b9c059e","Type":"ContainerDied","Data":"f14281a272c6d6eb499b74acfa60628bba07d84be3da354162868356c12b9133"} Oct 06 17:16:54 crc kubenswrapper[4763]: I1006 17:16:54.790066 4763 scope.go:117] "RemoveContainer" containerID="f14281a272c6d6eb499b74acfa60628bba07d84be3da354162868356c12b9133" Oct 06 17:16:54 crc kubenswrapper[4763]: I1006 17:16:54.981798 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vwsqt_must-gather-vk9mz_5d3ed696-244b-4625-aa3b-74471b9c059e/gather/0.log" Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.649660 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vwsqt/must-gather-vk9mz"] Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.650599 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" podUID="5d3ed696-244b-4625-aa3b-74471b9c059e" containerName="copy" containerID="cri-o://1295a6c4a8ac0ef04b65fa50f2f7318d754908a78113e8c7666792acbb4d0e1c" gracePeriod=2 Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.668080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vwsqt/must-gather-vk9mz"] Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.876456 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.876825 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.893819 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vwsqt_must-gather-vk9mz_5d3ed696-244b-4625-aa3b-74471b9c059e/copy/0.log" Oct 06 17:17:03 crc kubenswrapper[4763]: I1006 17:17:03.894505 4763 generic.go:334] "Generic (PLEG): container finished" podID="5d3ed696-244b-4625-aa3b-74471b9c059e" containerID="1295a6c4a8ac0ef04b65fa50f2f7318d754908a78113e8c7666792acbb4d0e1c" exitCode=143 Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.184510 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vwsqt_must-gather-vk9mz_5d3ed696-244b-4625-aa3b-74471b9c059e/copy/0.log" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.185252 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.358675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcwbh\" (UniqueName: \"kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh\") pod \"5d3ed696-244b-4625-aa3b-74471b9c059e\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.358899 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output\") pod \"5d3ed696-244b-4625-aa3b-74471b9c059e\" (UID: \"5d3ed696-244b-4625-aa3b-74471b9c059e\") " Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.395816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh" (OuterVolumeSpecName: "kube-api-access-gcwbh") pod "5d3ed696-244b-4625-aa3b-74471b9c059e" (UID: "5d3ed696-244b-4625-aa3b-74471b9c059e"). InnerVolumeSpecName "kube-api-access-gcwbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.461949 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcwbh\" (UniqueName: \"kubernetes.io/projected/5d3ed696-244b-4625-aa3b-74471b9c059e-kube-api-access-gcwbh\") on node \"crc\" DevicePath \"\"" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.579680 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5d3ed696-244b-4625-aa3b-74471b9c059e" (UID: "5d3ed696-244b-4625-aa3b-74471b9c059e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.666735 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d3ed696-244b-4625-aa3b-74471b9c059e-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.928423 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vwsqt_must-gather-vk9mz_5d3ed696-244b-4625-aa3b-74471b9c059e/copy/0.log" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.932103 4763 scope.go:117] "RemoveContainer" containerID="1295a6c4a8ac0ef04b65fa50f2f7318d754908a78113e8c7666792acbb4d0e1c" Oct 06 17:17:04 crc kubenswrapper[4763]: I1006 17:17:04.932249 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwsqt/must-gather-vk9mz" Oct 06 17:17:05 crc kubenswrapper[4763]: I1006 17:17:05.009154 4763 scope.go:117] "RemoveContainer" containerID="f14281a272c6d6eb499b74acfa60628bba07d84be3da354162868356c12b9133" Oct 06 17:17:05 crc kubenswrapper[4763]: I1006 17:17:05.623734 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3ed696-244b-4625-aa3b-74471b9c059e" path="/var/lib/kubelet/pods/5d3ed696-244b-4625-aa3b-74471b9c059e/volumes" Oct 06 17:17:33 crc kubenswrapper[4763]: I1006 17:17:33.877461 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:17:33 crc kubenswrapper[4763]: I1006 17:17:33.878106 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:18:03 crc kubenswrapper[4763]: I1006 17:18:03.876655 4763 patch_prober.go:28] interesting pod/machine-config-daemon-9g2sw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 17:18:03 crc kubenswrapper[4763]: I1006 17:18:03.877227 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 17:18:03 crc kubenswrapper[4763]: I1006 17:18:03.877277 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" Oct 06 17:18:03 crc kubenswrapper[4763]: I1006 17:18:03.878122 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78122d4a4b1a38964c34d3e4aeb5fd4ecebf34cb609a41e7e87f3e6de70968f1"} pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 17:18:03 crc kubenswrapper[4763]: I1006 
17:18:03.878179 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" podUID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerName="machine-config-daemon" containerID="cri-o://78122d4a4b1a38964c34d3e4aeb5fd4ecebf34cb609a41e7e87f3e6de70968f1" gracePeriod=600 Oct 06 17:18:04 crc kubenswrapper[4763]: I1006 17:18:04.629928 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c91c0c6-f031-4840-bc66-ab38e8fb67c7" containerID="78122d4a4b1a38964c34d3e4aeb5fd4ecebf34cb609a41e7e87f3e6de70968f1" exitCode=0 Oct 06 17:18:04 crc kubenswrapper[4763]: I1006 17:18:04.630465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerDied","Data":"78122d4a4b1a38964c34d3e4aeb5fd4ecebf34cb609a41e7e87f3e6de70968f1"} Oct 06 17:18:04 crc kubenswrapper[4763]: I1006 17:18:04.630490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9g2sw" event={"ID":"4c91c0c6-f031-4840-bc66-ab38e8fb67c7","Type":"ContainerStarted","Data":"674790cd748c065d62a40771f3f7d2808bd8b15a17760eb555d49ee655d1c7d6"} Oct 06 17:18:04 crc kubenswrapper[4763]: I1006 17:18:04.630522 4763 scope.go:117] "RemoveContainer" containerID="c8a953962efe7e3a72255e2fd1e02274657a9ed2f85e1c1d02916afab8496bd4"